Thursday, October 4, 2018

The state of Artificial Intelligence

It seems that a day doesn't go by without artificial intelligence (AI) making the headlines. On the one hand, we are being warned about the potential negative impact of AI on the job market; on the other hand, we are being made to believe that an AI-driven techno-utopia is right around the corner. The truth, as is usually the case, is far from black and white.


How We Got Here 

If you enter the term "artificial intelligence" into Google Trends, you'll notice a sharp uptick in search interest around the beginning of 2017. "I'd say in 2017 we reached the tipping or inflection point when it comes to enterprise adoption of AI technologies", says Kashyap Kompella, a contributing analyst at Real Story Group.

"We saw enterprise use cases emerge in text analytics, natural language processing, voice recognition, image recognition, and, of course, structured data analysis in a variety of domains." Indeed, 2017 was the year AI became a household name, but its story starts much earlier than that.

Thought-capable artificial beings can be found in tales dating back to antiquity, and the word "robot" was introduced in 1921 in a science fiction play titled R.U.R. (Rossum's Universal Robots) by the Czech writer Karel Čapek. The first work that we now recognize as an example of AI comes from 1943, and the field of AI research was born in 1956 at Dartmouth College.

Despite the lofty aspirations of AI researchers during the post-war era, progress soon slowed to a crawl and real-world use cases were nowhere to be found. That changed in the late 1990s thanks to increasing computational power and a greater emphasis on solving specific problems. In 1997, IBM reminded the whole world that computers had been getting smarter when its chess-playing computer Deep Blue defeated world champion Garry Kasparov.

In 2011, another IBM supercomputer, Watson, participated in a "Jeopardy!" exhibition match and defeated two of the show's greatest champions by a significant margin. What made the spectacle especially memorable was that Watson could not only understand questions posed in natural language but also respond to them using a voice synthesized in real time, something that had seemed unthinkable just a decade earlier.

Today, just a few years after Watson's incredible performance, computers that understand questions posed in natural language and respond to them in real time using a synthesized voice are everywhere, including in our pockets. From Apple's Siri to Amazon's Alexa, Microsoft's Cortana, and Google Assistant, the technology that was once a dream is now a reality, and we have all helped to make it happen.


Big Data Revolution

"I've been working in AI for now more than 30 years and in the past eight years there are things that have occurred that I never thought would happen in my lifetime", said Adam Cheyer, co-founder of Viv Labs. "I believe within the next two to three years you will use an assistant for everything you use the web and apps for and it will be better than either of them."

The reason AI is suddenly so much more useful despite having been around for many decades has everything to do with the digitization of virtually every aspect of our lives. We have arrived at a point where we create around 2.5 quintillion bytes of data each day. Ninety percent of all the data in the world was generated over the last two years alone, and our data output will only increase in the future.

The enormous quantities of data we produce, collectively referred to as big data, have been instrumental in an area of AI called deep learning. "The [deep learning] system basically learns by itself using a lot of data and computation. If you keep showing it pictures of an orange, it eventually figures out what an orange is - or a Chihuahua versus a Labrador versus a small pony", explains Nvidia CEO Jensen Huang.
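To make the idea of learning from labeled examples a bit more concrete, here is a minimal, self-contained sketch in plain Python and NumPy (not a production deep learning framework; the two "classes" of points are synthetic stand-ins for real images): a tiny two-layer neural network that learns to tell the classes apart purely from labeled samples, the same principle that, at a vastly larger scale, lets deep learning systems tell an orange from a Chihuahua.

```python
import numpy as np

# Toy "dataset": two classes of 2-D points drawn from different clusters.
# In real deep learning the inputs would be images and the labels "orange",
# "Chihuahua", and so on, but the learning principle is the same.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=[-1.0, -1.0], scale=0.5, size=(200, 2))
class_b = rng.normal(loc=[+1.0, +1.0], scale=0.5, size=(200, 2))
X = np.vstack([class_a, class_b])
y = np.array([0] * 200 + [1] * 200).reshape(-1, 1)

# A tiny two-layer neural network: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train with plain gradient descent on the cross-entropy loss.
learning_rate = 0.5
for step in range(2000):
    # Forward pass: compute the network's current predictions.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the loss with respect to each parameter.
    grad_out = (p - y) / len(X)
    grad_W2 = h.T @ grad_out
    grad_b2 = grad_out.sum(axis=0)
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    # Nudge every parameter in the direction that reduces the loss.
    for param, grad in ((W1, grad_W1), (b1, grad_b1), (W2, grad_W2), (b2, grad_b2)):
        param -= learning_rate * grad

accuracy = ((p > 0.5) == y).mean()
print(f"training accuracy after 2000 steps: {accuracy:.1%}")
```

Nothing here is told what the classes "mean"; the network figures out the boundary between them purely from labeled examples, which is exactly the behavior Huang describes, just on a microscopic scale.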

According to an analysis of more than 400 use cases across 19 industries and nine business functions by McKinsey & Company, deep learning is among the AI techniques most likely to be used in real-world AI applications. The analysis further suggests that, on average, deep learning techniques have the potential to provide a boost in value over traditional analytics techniques ranging from 30 percent to 128 percent, depending on the industry.

Current Use Cases

Use cases for AI have emerged in nearly every industry and some industries are already being transformed by it:

Automotive: 

Just a few years ago, it would have been unthinkable for a vehicle to drive itself, but there are now multiple car manufacturers that offer vehicles with self-driving capabilities. Tesla's fleet has accumulated over a billion miles of autonomous driving, and the company is getting ready for its first autonomous coast-to-coast drive. It will still take some time before self-driving cars are seen as first-class citizens on the road, but mining companies have been relying on fleets of autonomous mining trucks for years now as an efficient way to mitigate the effects of widespread labor shortages in the mining industry.

Customer support: 

It's estimated that the global chatbot market will be worth around USD 1.3 billion by 2025, up from only USD 190 million in 2016. Chatbots, computer programs that conduct a conversation via auditory or textual methods, are quickly becoming a preferred way to get customer support because they are convenient to use, are available around the clock, and don't require users to install an additional application to interact with them. Companies such as Facebook, Lyft, Fandango, Spotify, Whole Foods, Sephora, Mastercard, Staples, The Wall Street Journal, and Marriott International are all using chatbots to improve their customer support.
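To illustrate the textual side of that interaction, here is a toy sketch of a rule-based support bot in Python. Real customer support chatbots rely on natural language processing platforms rather than hand-written keyword rules, and every intent and reply below is invented for the example.

```python
import re

# Hand-written "intents": a pattern and a canned reply for each.
# Production chatbots infer intent with NLP models instead of regexes,
# but the request/response loop looks much the same.
INTENTS = [
    (re.compile(r"\b(hours|open|closed)\b", re.I),
     "We are available around the clock, one advantage of talking to a bot."),
    (re.compile(r"\b(order|delivery|shipping)\b", re.I),
     "Please share your order number and I will look up its status."),
    (re.compile(r"\b(refund|return)\b", re.I),
     "I can start a return for you. Which item would you like to send back?"),
]

FALLBACK = "I'm not sure I understood that. Let me connect you to a human agent."

def reply(message: str) -> str:
    """Return the reply of the first intent whose pattern matches the message."""
    for pattern, answer in INTENTS:
        if pattern.search(message):
            return answer
    return FALLBACK

if __name__ == "__main__":
    print(reply("Are you open on Sundays?"))
    print(reply("I want a refund for my headphones"))
    print(reply("Tell me a joke"))
```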

Healthcare: 

AI has many use cases in healthcare. For example, Qure.ai's deep learning algorithms have been trained on over a million curated X-rays and radiology reports to detect abnormal chest X-rays, screen for tuberculosis, and identify common abnormalities. The system has been successfully used in public health screening programs, reaching up to 99 percent accuracy. AI is also likely to play an important role in medical diagnosis. Around 10 percent of patient deaths and between 6 and 17 percent of all hospital complications are attributed to diagnostic errors, and automated diagnostic tools could be trained to provide a second opinion and warn about a potential misdiagnosis.

Manufacturing: 

AI has many use cases in manufacturing, from real-time maintenance of equipment using virtual models that pair the virtual and physical worlds, allowing data to be analyzed and systems to be monitored so problems can be headed off before they even occur, to generative design, in which a program generates all the possible permutations of a solution that meet certain constraints. The trend of automation and data exchange in manufacturing even has a name: Industry 4.0, commonly referred to as the fourth industrial revolution.


Roadblocks to AI Adoption and Evolution

All of the above-mentioned AI use cases are achieved by highly specialized algorithms that excel at performing tasks that take humans just a second to perform. However, the same algorithms can't accomplish anything outside their narrow specialty, not even things any toddler can do.

"When an app can look at the cancer diagnosis and provide results better than a human doctor, when an autonomous car can … control the car in a safer way than a human, when a weapon system is able to guide itself without the aid of the pilot -when it can only do that one task with reliability and accuracy -that is narrow AI", said Amir Husain, founder and CEO of SparkCognition and author of The Sentient Machine: The Coming Age of Artificial Intelligence. "Today, that is where we are poised. That is not to say we aren't going toward more generalized AI."

Artificial general intelligence, or AGI for short, is the intelligence that would allow machines to successfully perform any intellectual task that a human can. Currently, over 40 organizations around the world are actively researching AGI but there are still many challenges to overcome to even come close to it.

To put things into perspective, consider that leading speech recognition systems are built using about 50,000 hours of audio data and corresponding transcripts. Before AlphaGo became the first computer program to defeat a world champion at the ancient Chinese board game Go, several leading AI experts had been working on it for years.

Still, progress is being made, as was demonstrated at the International Conference on Learning Representations (ICLR) in Vancouver, British Columbia. Researchers there described, for example, a technique that allowed a robot to tie a knot and navigate an office with no specific guidance and after only one demonstration. Just a few years ago, it was unthinkable for AI to do even a fraction of the things it does today, so who knows what the future might bring.


Conclusion

Artificial intelligence has evolved significantly over the last few years and found application in virtually every industry. Many experts believe that it will have the same impact on the world as electricity had, but it's impossible to estimate what the timeline may be. 

At the moment, we are still far away from the general intelligence imagined by early sci-fi authors, but technology is known to progress at an exponential rate.  


Friday, May 18, 2018

The present and future of Quantum Computing

Google, Intel, IBM, Alibaba, and many other established companies and startups alike are racing to see who can overcome the limitations that nature itself imposes on current semiconductor fabrication process technology. Their hope is to enter the quantum realm, where the laws of physics don't work the way we're used to, and develop the next generation of supercomputers: quantum computers.



The End of Moore's Law

Ever since Gordon Moore, the co-founder of Fairchild Semiconductor and Intel, observed in 1965 that the number of components per integrated circuit doubles roughly every two years, the advancements in digital electronics have been following a predictable pattern.

In 2010, the International Technology Roadmap for Semiconductors predicted a slowdown of the trend, and Moore himself foresaw the end of an era when he stated that Moore's law would be dead in the next decade or so.

"Now we're getting to the point where it's more and more difficult, and some of the laws are quite fundamental", said Moore in an interview in 2015. "We're very close to the atomic limitation now. We take advantage of all the speed we can get, but the velocity of light limits performance. These are fundamentals I don't see how we [will] ever get around. And in the next couple of generations, we're right up against them."

Currently, most computer chips are being manufactured using the 14-nanometer lithography process. For comparison, the human immunodeficiency virus (HIV) is roughly spherical with a diameter of about 120 nm. The next step is the 10-nanometer lithography process. At this size, the wires inside integrated circuits become so small that the barrier layer takes up most of the interconnect, leaving less space for the copper itself. 

While approaching the size of an atom is extremely difficult, getting past it is a whole other story, which is where quantum computing comes in.

The Beginning of Quantum Era

"The exciting thing about quantum computers is that they work fundamentally differently from today's computers", explains Terry Hickey, IBM Global Business Services Business leader for Cognitive & Analytics.

In traditional computers, bits are the smallest units of information. Each bit can have one of two possible values: a one or a zero. As such, a bit roughly corresponds to a switch with one on position and one off position.

"In contrast, a quantum computer makes use of quantum bits, or qubits, where each can represent a one, a zero, or both at once, which is known as superposition. This property along with other quantum effects enable quantum computers to perform certain calculations vastly faster than classic computers", says Hickey. 

A qubit can be any two-level quantum system, such as the polarization encoding of a photon, the electronic spin of an electron, or the nuclear spin of a nucleus. "We can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'", says Alexey Fedorov, a physicist at the Moscow Institute of Physics and Technology.

Entangled states are close connections that make each of the qubits react to a change in the other's state instantaneously, no matter how far apart they are. This quantum phenomenon makes it possible to deduce properties of all entangled qubits by measuring just one of them, allowing quantum computers to store massive amounts of information using less energy than traditional computers.
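One rough way to get a feel for superposition and entanglement is to simulate the underlying state vectors on an ordinary computer. The sketch below uses plain NumPy (no real quantum hardware and no quantum framework such as Qiskit): it prepares a single qubit in an equal superposition, prepares two qubits in an entangled Bell state, and then samples measurement outcomes to show the 50/50 split and the perfectly correlated results.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- One qubit in superposition ---
# A qubit's state is a vector of two complex amplitudes; |0> is (1, 0).
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
superposed = H @ ket0                 # amplitudes (1/sqrt(2), 1/sqrt(2))

probs = np.abs(superposed) ** 2       # measurement probabilities: 50% / 50%
samples = rng.choice([0, 1], size=1000, p=probs)
print("single qubit measured as 1 in", samples.sum(), "of 1000 shots")

# --- Two entangled qubits (a Bell state) ---
# Start from |00>, apply Hadamard to the first qubit, then a CNOT gate.
ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0
H_on_first = np.kron(H, np.eye(2))
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ (H_on_first @ ket00)    # amplitudes (1/sqrt(2), 0, 0, 1/sqrt(2))

# Measuring both qubits always gives correlated results: 00 or 11, never 01 or 10.
bell_probs = np.abs(bell) ** 2
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=bell_probs)
print({o: int((outcomes == o).sum()) for o in ["00", "01", "10", "11"]})
```

Of course, a classical simulation like this needs exponentially more memory as qubits are added, which is precisely why real quantum hardware is interesting in the first place.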

Quantum Supremacy

Quantum computers are exciting and promising, but they're still outperformed by machines that rely on the same manufacturing process that has fueled the information age - so far. We're rapidly approaching a point called quantum supremacy, which is when quantum machines will be able to do calculations that existing computers aren't capable of processing.

"Hopes of reaching quantum supremacy have been dashed before. For some time, researchers thought that a 49-qubit machine would be enough, but last year researchers at IBM were able to simulate a 49-qubit quantum system on a conventional computer", write Martin Giles and Will Knight.

Intel unveiled its 49-qubit quantum computer early in 2018, at the International Consumer Electronics Show (CES), falling just one qubit behind IBM's 50-qubit prototype quantum computer. In March 2018, however, Google took the quantum crown when the company introduced its new quantum processor, codenamed Bristlecone, with 72 qubits.

"We are cautiously optimistic that quantum supremacy can be achieved with Bristlecone, and feel that learning to build and operate devices at this level of performance is an exciting challenge! We look forward to sharing the results and allowing collaborators to run experiments in the future", posted Julian Kelly, Research Scientist at Google's Quantum AI Lab.

"Operating a device such as Bristlecone at low system error requires harmony between a full stack of technology ranging from software and control electronics to the processor itself. Getting this right requires careful systems engineering over several iterations."

However, even Google's quantum workhorse has nowhere near as many qubits as the best D-Wave systems. The Vancouver-based company sells a quantum computer, the D-Wave 2000Q, with 2048 physical qubits.

"With 2000 qubits and new control features, the new system can solve larger problems than was previously possible, with faster performance, providing a big step toward production applications in optimization, cybersecurity, machine learning, and sampling", stated D-Wave in an official announcement.

The first customer for the D-Wave 2000Q was Temporal Defense Systems Inc. (TDS), a cutting-edge cybersecurity firm. Many researchers, however, believe that D-Wave's systems are not true quantum computers because the underlying ideas for the D-Wave approach arose from experimental results in condensed matter physics, not from the conventional quantum information field.

Use Cases for Quantum Computing

A research report from Morgan Stanley predicts that quantum computing will double the high-end computing market from USD 5 billion to USD 10 billion. "Quantum computing is at an inflection point - moving from fundamental theoretical research to an engineering development phase, including commercial experiments", the report states.

According to Morgan Stanley, the traditional transistor won't go to Silicon Heaven any time soon because quantum computing is not suited to all compute tasks. The vast majority of devices, including smartphones, personal computers, and web servers, will continue to run on current technology, but a small subset of high-end compute platforms may start its transition to quantum technology not long from now, around 2025.

"The classical computer is a large calculator that is very good at calculus and step-by-step analytics, whereas the quantum computer looks at solving problems from a higher point of view. Quantum computing does not make the classic computer irrelevant-smartphones and laptops will still use transistors for the foreseeable future, and the transition might take several years."

Even though quantum computers are not yet ready for primetime, we can already identify several use cases for quantum computing, including those described below. 

Cryptography

Today's cryptography is extremely secure against brute-force attacks executed using traditional computers. For example, it would take even the most powerful supercomputer in the world so absurdly long to break a symmetric 256-bit key that our sun would no longer be around to see anyone succeed.
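A back-of-the-envelope calculation shows just how absurd the numbers are. The guess rate below is a deliberately generous, hypothetical assumption of a quintillion (10^18) keys tested per second, far beyond what any real key-cracking setup achieves; even then, the expected search time dwarfs the roughly five billion years the sun has left.

```python
# Back-of-the-envelope: brute-forcing a symmetric 256-bit key.
keyspace = 2 ** 256                   # number of possible 256-bit keys
guesses_per_second = 10 ** 18         # hypothetical, extremely generous rate
seconds_per_year = 60 * 60 * 24 * 365

# On average, a brute-force search succeeds after trying half of the key space.
years_needed = keyspace / 2 / guesses_per_second / seconds_per_year
sun_remaining_years = 5e9             # the sun has roughly 5 billion years left

print(f"expected time to crack: {years_needed:.2e} years")
print(f"that is about {years_needed / sun_remaining_years:.2e} times "
      f"the remaining lifetime of the sun")
```

The result is on the order of 10^51 years, which is why nobody attacks a properly used 256-bit key by brute force.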

But quantum computers operate on different principles from traditional computers, and they are especially well suited to solving particular mathematical problems, such as those underlying modern cryptography.

"For public key cryptography, the damage from quantum computers will be catastrophic", said Lily Chen, mathematician and leader of the National Institute of Standards and Technology's Cryptographic Technology Group, in a session at the American Association for the Advancement of Science's 2018 annual meeting in Austin, Texas.

One possible approach to securing cryptography against quantum computers involves the use of quantum-based cryptographic systems. As such, the very technology that could render today's cryptography obsolete could also give us the cryptography of tomorrow.

Simulations

In September 2017, Science, the peer-reviewed academic journal of the American Association for the Advancement of Science, reported that a quantum computer had simulated the behavior of beryllium hydride. While beryllium hydride is a relatively simple molecule, its simulation is seen by physicists and chemists as a step toward a new way to discover drugs and materials.

"Researchers are also excited about the prospect of using quantum computers to model complicated chemical reactions, a task that conventional supercomputers aren't very good at all", write Abigail Beall and Matt Reynolds in their Wired article. "Eventually, researchers hope they'll be able to use quantum simulations to design entirely new molecules for use in medicine."

Simulations run on quantum computers could also make existing treatments, such as radiation therapy, more effective. The current software and hardware used by medical dosimetrists to calculate the best possible radiation dose cannot guarantee optimal results, as there are far too many variables to take into account. Quantum computers of the future could, however, do just that and make this and many other existing treatments far more effective.

Artificial Intelligence

"The promise is that quantum computers will allow for quick analysis and integration of our enormous data sets which will improve and transform our machine learning and artificial intelligence capabilities", writes internationally best-selling author and keynote speaker Bernard Marr.

According to Seth Lloyd, a physicist at the Massachusetts Institute of Technology and a quantum-computing pioneer, 60 qubits should be enough to encode an amount of data equivalent to that produced by humanity in a year.

The ability to store and process massive amounts of data makes quantum computers perfect for machine-learning techniques, which are responsible for getting computers to act without being explicitly programmed.

Conclusion

There are still many challenges for quantum computing, but researchers across multiple fields have been able to achieve great success and overcome obstacles that would have seemed insurmountable not too long ago. We don't know yet exactly what advancements quantum computing will enable. It seems, however, that it could be a major driving element of a fourth industrial revolution and usher in an era of rapid productivity growth.

Monday, April 30, 2018

How to become a professional software developer

Employment of software developers is growing faster than the average for all occupations. The Bureau of Labor Statistics projects a growth of 24 percent from 2016 to 2026, which translates into 302,500 new job openings for software developers.

Despite the abundance of job openings for software developers, only around 3 percent of college students graduate with a degree in computer science, according to Computer Science Education Week. What is perhaps even more surprising is the fact that only 8 percent of STEM graduates are in computer science.

One possible explanation for the discrepancy between the high demand for software developers and the low supply of recent graduates who want to become professional software developers is the seemingly thorny path that leads to many attractive software development jobs.

Many people believe that the median annual wage for software developers, which is currently around USD 100,000, is reserved for a small handful of extraordinarily talented developers who have been programming since childhood and have a natural aptitude for math.

In this article, I try to dispel some of the myths that surround the profession and lay down a clear path for aspiring developers to follow.



Do I need a degree to become a professional software developer?

In 2012, Microsoft published a study showing that, depending on the specific computing occupation, anywhere from less than one-fourth to less than one-half of workers have a computer science degree. As noted back in 1998 by the General Accounting Office, "IT workers come from a variety of educational backgrounds and have a variety of educational credentials such as master’s degrees, associate degrees, or special certifications."

In fact, 36 percent of IT workers do not hold a college degree at all, according to a 2013 paper from the Economic Policy Institute that reviewed and analyzed the STEM labor market and workforce. Furthermore, a 2015 developer survey conducted by Stack Overflow, an online community where software developers and others ask tough questions and get answers from their peers, revealed that over 40 percent of software developers are self-taught.

Considering these numbers, it is clear that one does not need a degree to become a professional software developer. As I'll explain more in the following section of this article, software developers are judged mainly by their skills and professional experience and not by their educational background.

Of course, having a computer science degree from a prestigious university can increase one's chances of landing a great job, but it is just one of many things to which recruiters and interviewers pay attention.

Do you have to be good at math to be a good software developer?

Every day, there is a new thread on some programming forum posted by someone wondering whether they need to be good at math to be good at developing software. The answer to this question is simple: no. But it takes more than one word to get to the bottom of the issue.

There is a good reason why people who are good at math have a natural inclination toward software development. Both math and programming are about breaking down complex problems into smaller parts and spotting patterns.

Or you can just take the word of Richard P. Gabriel, an American computer scientist known for his work on the Lisp programming language and for his essay Lisp: Good News, Bad News, How to Win Big, who said, "Programmers are not mathematicians, no matter how much we wish for it."

But just because software developers are not mathematicians does not mean that math has no place in software development - far from it! There are many areas of software development, such as data cleaning, data aggregation, anomaly detection, game development, 3D imaging, or graphics in general, where a solid grasp of fairly complex math (multivariable calculus, discrete mathematics, derivatives, integrals, statistics, and many physical equations, to give a few examples) is essential.

Lalit Kundu, a software engineer on Google's core payments backend team, provides some examples of the math he has used since he started working for Google in a post on Quora. "Depending on the team and field you’re working in, the amount of maths you’ve to know could range from high school algebra to graduate level stuff", Lalit says.

Because the world of software development is so complex and large, there is a place for every software developer regardless of their knowledge of math. The key is to be honest during the interview process to avoid getting a job that would be better suited for someone with a different skillset.

Which programming language should I learn?

Wikipedia currently lists around 750 programming languages. Needless to say, only a fraction of the languages listed on Wikipedia are used in practice, and even fewer are useful for becoming a professional software developer.

Every year, GitHub, a web-based hosting service for version control that provides access control and collaboration features such as bug tracking, feature requests, task management, and wikis for every project hosted there, publishes its State of the Octoverse report. Among other things, the report shows which programming languages are on the rise and which are on their way to obsolescence.

In 2017, the top 15 most popular languages on GitHub were JavaScript, Python, Java, Ruby, PHP, C++, CSS, C#, Go, C, TypeScript, Shell, Swift, Scala, and Objective-C, in this order. 

With 2.3 million pull requests, JavaScript is more than twice as popular as the second most popular programming language, Python. The fifth most popular language, PHP, received less than a quarter as many pull requests as JavaScript, and the tenth, C, received roughly a tenth as many.

However, just because a certain programming language is popular does not necessarily mean it's sought after by employers, nor does it make it the best choice for solving every problem. In other words, if you plan a lifelong career as a software developer, be prepared to learn multiple programming languages. Once you learn just two languages, you realize that it does not really matter all that much which language you study first, or even which language you choose as your primary one, because software development is more about being able to divide complex problems into smaller and simpler pieces and understanding popular software development practices and patterns, all of which comes with practice and experience.

As software engineer Mahesh Mathapati excellently puts it in his post on Quora, "[Programming languages] are mere mediums to convert logic into instructions which computers can understand. Don't worry about them now, you will learn them in 4 years engineering course. The real gems are your aptitude, analytical skills, basic mathematics, understanding of the problem, good communication skills to explain yourself to colleagues and clients, etc."

How can I gain programming experience?

Companies do not expect software developers applying for a job to have a computer science degree, but they do expect them to have some kind of proof that they are qualified.

Arguably the best way for a software developer to demonstrate their skills, and improve them at the same time, is to build software.

Personal projects that solve real problems in a demonstrable way and earn praise from the wider developer community are an excellent way to impress an interviewer and land a job. Some software developers take things a step further and purposefully build software with the same software stack used by the company they would like to work for.

Besides projects, internships are also a great way to gain hands-on programming experience and learn how teams of professional software developers function. While paid internships are becoming harder and harder to come by, even unpaid internships could pay off further down the road. 

How to stand out and get a job?

Having excellent software development skills is not enough to stand out from other equally skilled software developers. While working on projects and learning various programming languages, software developers who want to make it big in the industry should also focus on cultivating their professional network.

Software developers gather in places such as Stack Overflow, GitHub, Reddit, LinkedIn, Medium, or Quora, where they ask questions, submit answers, post blog posts, and publish source code. Having a solid online presence can be a huge advantage when looking for a job since 40 percent of new hires come via employee referral, according to Jobvite.

Last but definitely not least, software developers should not forget about their soft skills. Professional software development is a team effort, and there is very limited space for software developers who struggle to express themselves and cooperate with others.  

Conclusion

Becoming a software developer is within reach of more people than realize it. One does not need a computer science degree to get a job as a software developer, and one certainly does not need to be a math whiz to develop software applications used by millions of people around the world.

What is needed is an aptitude for breaking down complex problems into smaller parts and the willingness to constantly learn new things in order to keep up with new technologies and thinking models.

Sunday, April 29, 2018

Compliance with EU-DSGVO / GDPR

Dear readers of my blog,

in order to comply with the EU-DSGVO / GDPR, I unfortunately had to disable some functionality, notably the option to leave comments and the social media share buttons below my blog posts.

The new regulation requires a double-opt-in mechanism for these functions that I cannot implement on this blog without notable effort.

My site is completely ad-free and has no monetization of any kind. I can only attend to it in my spare time, and I do not want to risk any legal issues.

Thanks for your understanding.

Friday, March 9, 2018

Status Quo of #Blockchain

Not even 10 years after an unknown person or group of people working under the name Satoshi Nakamoto invented Bitcoin, the first cryptocurrency in the world, the technology underpinning Nakamoto's creation is everywhere, promising to change the world as we know it.

That technology is called blockchain, and Bitcoin uses this innovative ledger to record transactions that happen on its peer-to-peer, decentralized network. But financial record-keeping is not blockchain’s only forte. What makes blockchain special is its ability to ensure immutability without any trusted authority.

This key property of blockchain technology makes it suitable for a number of different applications, the most promising of which we explore in this article. But before we get to how blockchain technology could lead to the global disruption of established institutions and entire industries, we find it appropriate to take a closer look at blockchain's present state.



Blockchain has come a long way

Mike Gault, the CEO of Estonia's technology vendor Guardtime, defines blockchain as "an append-only data structure that contains data records that are cryptographically linked together." According to Gault's definition, "Data records are added to the data structure when multiple distributed parties come to a consensus based on pre-agreed rules." Gault provided this definition in response to the rapidly growing number of projects developing blockchain-based solutions that differ widely from the blockchain Satoshi Nakamoto created for his disruptive cryptocurrency.
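Gault's definition can be illustrated in a few lines of Python: data records that each carry the cryptographic hash of the record before them, so that tampering with any record breaks every link after it. This is only a sketch of the linking idea; it leaves out the peer-to-peer network, the consensus rules, and the mining that real blockchains add on top.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's contents (which include the previous block's hash)."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Append a new record, cryptographically linked to the block before it."""
    previous_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "timestamp": time.time(),
                  "data": data, "previous_hash": previous_hash})

def chain_is_valid(chain: list) -> bool:
    """Recompute every link; a tampered record breaks the chain after it."""
    return all(chain[i]["previous_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, "Alice pays Bob 5 coins")
append_block(ledger, "Bob pays Carol 2 coins")
append_block(ledger, "Carol pays Dave 1 coin")
print("valid before tampering:", chain_is_valid(ledger))   # True

ledger[1]["data"] = "Bob pays Carol 2000 coins"             # rewrite history...
print("valid after tampering:", chain_is_valid(ledger))     # False
```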

Both Bitcoin and Ethereum, the most popular blockchain-based computing platform in the world, allow for the creation of so-called decentralized applications, or DApps for short. At the time of writing this article, there are almost 1,100 DApps listed on the State of the DApps website. Confusingly enough, Bitcoin, Ethereum, and other altcoins, which are alternative cryptocurrencies launched after the success of Bitcoin, are themselves DApps because their backend code runs on a decentralized peer-to-peer network.

For example, Augur, one of the most popular Ethereum-based DApps, uses a blockchain-based betting system and the knowledge of the masses to create a forecasting tool capable of accurately predicting the outcomes of real-world events.

What Augur and many other DApps illustrate is just how incredibly far blockchain technology has come in less than a decade since it first gained traction. Today, there are several large blockchain enterprise alliances, including the Enterprise Ethereum Alliance with its 250+ members, Hyperledger, and R3, and annual revenue for the enterprise blockchain market is expected to increase from approximately USD 2.5 billion worldwide in 2016 to USD 19.9 billion by 2025, a CAGR of 26.2 percent, according to the State of Blockchain report from Coinbase.

Non-Crypto applications of Blockchain technology

While cryptocurrencies, and their wild price fluctuations, are still the main reason why blockchain technology makes headlines, it is the non-crypto applications of blockchain technology that excite futurologists the most.

Contract Negotiation 

Blockchain technology allows for the execution of smart contracts, which are computer programs intended to digitally facilitate, verify, or enforce the negotiation or performance of a contract. The term "smart contract" was coined by computer scientist Nick Szabo before Satoshi Nakamoto created Bitcoin. Many experts expect smart contracts to replace traditional paper contracts, naming complete immutability and enhanced transparency as the two biggest advantages of this nascent technology.

Because smart contracts are programmable and self-executing, they can be triggered automatically when a set of preconditions is met. For example, a smart contract between a customer and an e-commerce store could replace a third-party escrow service.
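To illustrate the idea of self-execution once preconditions are met, here is a plain-Python model of such an escrow arrangement. It only mimics the logic: an actual smart contract would be written in a language like Solidity and executed by the blockchain's nodes rather than by either party, and all names below are invented for the example.

```python
class EscrowContract:
    """Toy model of a smart contract holding a buyer's payment in escrow."""

    def __init__(self, buyer: str, seller: str, amount: float):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.funded = False
        self.delivery_confirmed = False
        self.released = False

    def deposit(self) -> None:
        # Precondition 1: the buyer locks the payment into the contract.
        self.funded = True
        self._maybe_execute()

    def confirm_delivery(self) -> None:
        # Precondition 2: delivery is confirmed, e.g. by a courier's signed message.
        self.delivery_confirmed = True
        self._maybe_execute()

    def _maybe_execute(self) -> None:
        # Self-execution: the moment all preconditions hold, the funds move to
        # the seller automatically, with no third-party escrow service involved.
        if self.funded and self.delivery_confirmed and not self.released:
            self.released = True
            print(f"{self.amount} released from {self.buyer} to {self.seller}")

contract = EscrowContract("customer", "e-commerce store", 49.99)
contract.deposit()             # nothing happens yet: the goods are not delivered
contract.confirm_delivery()    # both preconditions met -> payment is released
```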

Supply Chain Management 

Blockchain could be a game-changer for supply chain management, as traceability and transparency are among the most important foundations of logistics. "Blockchain offers a shared ledger that is updated and validated in real time with each network participant. It enables equal visibility of activities and reveals where an asset is at any point in time, who owns it and what condition it's in", explains IBM, describing some of the benefits of blockchain-based supply chain management.

The pharmaceutical industry has long struggled to secure its supply chain and prevent fake drugs from entering the market, and it is now starting to see blockchain technology as a potential solution to its problems. A Pistoia Alliance survey revealed that 83 percent of leaders in life sciences think blockchain will be adopted as a data housing solution within the next five years, and that 22 percent of life science companies are already using or experimenting with blockchain.

Record Keeping 

Because blockchain is essentially a distributed database, it can be used for all kinds of record keeping. In 2017, Dubai's land registrar revealed that it is developing a system to record all local real estate contracts on a blockchain, according to CoinDesk. Dubai's aim is to unite all real estate and department services on a single blockchain-based platform and to achieve this goal by the year 2020.

Dubai hopes that the move to blockchain will save the city money in the long run, as the average person moves to a new location nearly 12 times in their lifetime, and each move is not legally possible without someone pulling information from the system and putting new information in. Blockchain would also open Dubai's real estate market to investors around the world, giving them a way to verify property data, and it would connect renters with landlords and property-related billers.

Another form of record-keeping to which blockchain technology can be applied is the management of digital identities, including passports, birth certificates, wedding certificates, personal IDs, and even online accounts.

Cloud Storage

Right now, virtually all cloud storage services are centralized, which means that the users must place their trust in a single cloud storage provider. Blockchain could disrupt the existing cloud storage model and usher in an era of decentralized cloud storage services with enhanced security and guaranteed ownership of personal data.

One such cloud storage service, Storj, is currently preparing for launch, promising 99.99999% availability, fully distributed infrastructure, end-to-end encryption, and extremely low prices per GB of stored data.

Digital Voting

"Many states use voting machines that are over 10 years old that are not only antiquated and failing, they are also becoming increasingly expensive to maintain as parts are no longer manufactured. Election fraud undermines the very fabric of democracy" stated Blockchain Technologies Corp, a company that wants to replace existing voting machines with blockchain-based voting systems.

Estonia has already implemented a unique e-voting platform that uses blockchain technology to guarantee flawless availability, transparency, and an unprecedented level of fraud protection. Many other countries around the world are currently considering implementing systems based on the one used by the government of Estonia.

Potential roadblocks to the wider adoption of Blockchain

Blockchain technology will be able to fully realize its potential only if it overcomes the most serious roadblocks to its wider adoption.

Regulations

The soon-to-be-implemented General Data Protection Regulation (GDPR), by which the European Parliament, the Council of the European Union, and the European Commission intend to harmonize data privacy laws across Europe, protect and empower all EU citizens' data privacy, and reshape the way organizations across the region approach data privacy, is one such roadblock.

The GDPR stands in the way of wider blockchain adoption for two main reasons: personal data is not allowed to leave the EU, and the regulation demands that users' personal data be rectified or deleted under many circumstances.

According to Jan Philipp Albrecht, the member of the European Parliament who shepherded the regulation, the GDPR "is agnostic about which specific technology is used for the processing, but it introduces a mandatory obligation for data controllers to apply the principle of data protection by design. This means, for example, that the data subject’s rights can be easily exercised, including the right to deletion of data when it is no longer needed."

Considering that immutability is a key characteristic of blockchain technology, to say that it does not mix well with the GDPR would be an understatement. "Certain technologies will not be compatible with the GDPR" said Albrecht. "This does not mean that blockchain technology, in general, has to adapt to the GDPR, it just means that it probably cannot be used for the processing of personal data."

Scalability

In its present form, blockchain technology is severely limited by its insufficient scalability. Visa is known to handle around 2,000 transactions per second on average and up to 4,000 transactions per second during high shopping periods. The total capacity of the entire Visa network is expected to be over 50,000 transactions per second.

PayPal handles considerably fewer transactions per second, just over 100, but that is still a lot compared to Ethereum's maximum of only 15 transactions per second. With the current consensus rules, Bitcoin can only process around 7 transactions every second.

"The Ethereum community, key developers and researchers and others have always recognized scalability as perhaps the single most important key technical challenge that needs to be solved in order for blockchain applications to reach mass adoption" writes Vitalik Buterin, the mastermind behind Ethereum. "Blockchain scalability is difficult primarily because a typical blockchain design requires every node in the network to process every transaction, which limits the transaction processing capacity of the entire system to the capacity of a single node."

Conclusion

Blockchain technology is evolving rapidly, and there have already been a number of practical applications, with considerably more currently in various stages of development. If blockchain developers succeed in overcoming the most serious roadblocks to its wider adoption, blockchain could become one of the most transformative technologies of the 21st century - if not the most transformative.

Tuesday, February 13, 2018

Secrets to Successful Team Management

Behind every strong company are strong teams. And behind every strong team is a capable team manager who directs the team's focus and knows how to create an environment that allows individual team members to perform at their peak potential.

The focus of this article is the numerous techniques used by successful team managers to help their teams grow and overcome even the most difficult challenges. When applied correctly, the techniques and strategies described in this article can have a significant positive effect on a company’s bottom line, productivity, and culture.



Start with the Right Mindset

Many team managers start off on the wrong foot by approaching their team with the wrong mindset. Especially when managing software developers, team managers must realize and keep in mind that they are responsible for highly intelligent, talented individuals with an abundance of creativity and excellent problem-solving skills.

This fact is a common source of problems for inexperienced team managers, who tend to clash with opinionated team members over trifling issues. In contrast, experienced team managers take full advantage of their team's aptitude by maintaining an open mind and supporting the team's technical decisions rather than enforcing strict requirements.

As we will explain in the following sections, the right mindset for a team manager is actually the mindset of a leader. Unlike a manager, whose primary quality is the ability to control or administer, a leader empowers and inspires.

"A leader is at his best when people barely know he exists, when the work is done, the team will say: we did it ourselves. This is because the leader has empowered the team so much and they have become so competent that they are mini-leaders themselves, thus the leader is leading without actually leading them. This must be the highest form of leadership. Lead without leading", paraphrases the Chief Trainer Coach at Asia Coaching Training, Andy Ng, the ancient Chinese military strategist Sun Tzu. 

Provide a Strong Vision Statement

A strong vision statement is a critical component of any great strategic plan. Often confused with a mission statement, which explains why a company exists, a vision statement paints a picture of where a company is going. As such, a vision statement succinctly and clearly articulates an inspiring vision of a future which team members, as well as all stakeholders and even customers, can identify with at the emotional level. 

A strong vision statement is like a powerful inspirational speech, except that it is typically just a few words long. Some of the most powerful vision statements ever conceived talk about a distant yet reachable future. They are also direct, descriptive, and often audacious. They do not talk about revenue projections, shareholders, or competition. Instead, they envision a common benefit whose importance is immediately clear to everyone regardless of their affiliation with the company the statement represents.

Experienced and successful team managers understand the power of a strong vision statement to realize the full potential of their team, and they use it to their advantage. We can draw on years’ worth of scientific research in the field of management to illustrate just how effective a tool a vision statement can be. For example, a 2014 report from TINYpulse surveyed over 500 organizations worldwide and analyzed more than 200,000 anonymous employee responses to answer a simple question: what motivates employees to go the extra mile?

Perhaps surprisingly, the number one influence in encouraging employees to outperform expectations was camaraderie, the spirit of friendly good-fellowship associated with soldiers, who fight united to overcome seemingly insurmountable obstacles. Furthermore, nearly two-thirds of all employees felt discouraged by the lack of a strong company culture, which is to say by the lack of a strong vision that would motivate employees to strive for excellence.

A team manager must be able to recognize when their team lacks a purpose and come up with a strong vision statement to provide one.

Set Constraints

Once team members have a purpose, they need a plan. "This plan, designed by managers, should be set before the actual work begins. Managers should continuously encourage every developer to stick to the established plan", writes Scott Stiner, CEO and president of UM Technologies and its partner companies.

In software development, agile provides a proven and tested time-boxed approach based on iterative development, where requirements and solutions evolve through collaboration between self-organizing cross-functional teams, as described by the Agile Alliance. In other words, agile sets constraints as to what gets built in a given time frame and ensures that everyone focuses only on the plan ahead. 

Again, the work of a team manager does not end after a plan has been designed - far from it. A team manager should act as a filter against outside noise. We say "filter" on purpose because a good team manager knows which information they should allow to reach the team and which they should block before it distracts everyone for no good reason.

A team manager's job is to allow their team to be as productive as possible. That means handling non-development work so that everyone else on the team does not have to worry about it. "For example, if managers are coders, they can assist in writing tests to make it easier for developers to continue with their assigned tasks", Stiner explains. 

Foster Healthy Relationships

Each team member should feel that the team manager has his or her back - not that the manager is on his or her back. "Most projects fail because of people, not technology. You are managing a team of people. Humans. Humans are weird, complex creatures", says Peter Kelly, offering the perspective of a seasoned software engineer with over 15 years of experience under his belt. "Hiring, nurturing, coaching, maintaining and organizing a team of humans and managing that team to success on projects is the main job of a software manager."

All team managers will do themselves and their employees a huge favor if they build relationships based on transparency and trust rather than on controlling and reporting. Active listening and proactive communication are two foundational prerequisites for building this kind of relationship. It is easy to fall into the trap of simply walking around the office and asking individual team members how they are doing, but such effort is futile if it is not supported by a genuine desire to establish respectful bonds.

This brings us back to where we started: the importance of having the right mindset. It is impossible for any team manager to foster healthy relationships with team members unless the manager trusts the team members to do their jobs. As Ilja Preuß, a software developer at Disy Informationssysteme GmbH, suggests, "Find out what the developers like to do and find a way to let them do it so that it benefits the company. The most motivated people are those who do what they like doing."

When team members see that the team manager believes in them and goes above and beyond to remove every obstacle so that they can focus on their work, they are much more likely to support the team manager's decisions when the going gets tough. That support can be the difference between a team that sticks together and surmounts even the most difficult of challenges and a team that crumbles under the pressure of a deadline, unforeseen complications, or a last-minute change request.

In fact, a study from OfficeTeam examining the impact of appreciation, or the lack thereof, in the workplace found that 66 percent of employees would "likely leave their job because they didn’t feel appreciated". The number jumped to 76 percent among millennials, who are generally known for placing a higher value on healthy workplace relationships than previous generations.

The ability to show genuine appreciation and naturally communicate on a personal level is where the difference between average and exceptional team managers shows the most. Exceptional team managers have an innate understanding of the fact that team members who feel respected and valued work harder. They are not only able to praise and recognize a fruitful effort, but they also feel the need to do so because they understand that the recognition of efforts is one of the most important functions of a manager. 

"Think of leadership like computer programming. Dedicate some time to the backend to create a strong team and set them up for success. Come launch time, just step back and admire as they thrive. As a leader, you will know you have successfully built a strong team when they work effortlessly without you. The icing on the cake? Although they don’t rely on you to function, they still want you there", Marketing Strategist Taunya Williams summarizes what the role of team manager should be about. 

Conclusion

The role of team manager requires a broad range of personal competencies, not all of which can be easily acquired from books or experience. The best team managers give their team members the feeling that they are strongly supported, valued, and trusted. They motivate by creating a strong vision statement behind which everyone can rally, and they help their team maintain focus on what really matters by filtering out what does not. Perhaps above all, excellent team managers foster healthy relationships with individual team members and understand the value of making someone feel appreciated.
