
What is quantum computing?

Quantum computing is the next stage in humanity’s computational capability: it harnesses the logic of quantum-mechanical phenomena to reach computational speeds the world has never known.

Quantum computing has long been a distant dream for the world’s computer scientists. However, amid a global race to build ever more powerful machines, sufficient progress has now been made that the world’s first large-scale quantum computer is within reach.

Our current computers are based on transistors, the first of which was patented in 1926. These computers store information as “bits”, with each transistor holding the definite state of either a 1 or a 0.

Quantum computing, on the other hand, differs from standard computing systems thanks to the superposition principle. Superposition is a behaviour exhibited by subatomic particles such as electrons and photons, the fundamental particles of light. Thanks to this principle, a quantum computer can store information as both a 1 and a 0 at the same time – in what has been dubbed a “qubit”.

This means that two qubits can hold four values at once, and, as you expand the number of qubits, the machine becomes exponentially more powerful.
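
To make that scaling concrete, here is a minimal sketch (in Python, assuming NumPy is installed) of how the classical description of a quantum register grows: an n-qubit state is a list of 2^n complex amplitudes, which is why each added qubit doubles the resources needed to simulate the machine.

```python
# A minimal sketch of exponential qubit scaling (assumes NumPy).
# An n-qubit register is described by 2**n complex amplitudes, so every
# added qubit doubles what a classical simulator must keep in memory.
import numpy as np

for n in (1, 2, 10, 30):
    print(f"{n:>2} qubits -> {2 ** n:,} amplitudes")

# Two qubits in uniform superposition: all four basis states held at once.
state = np.full(4, 0.5)             # amplitudes for |00>, |01>, |10>, |11>
print(np.sum(np.abs(state) ** 2))   # probabilities sum to 1.0
```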

Our understanding of this is limited because quantum computing systems act in ways counterintuitive to how we view the everyday world. Todd Holmdahl, who oversees the quantum project at Microsoft, uses a maze to explain how he envisions quantum computing. “A typical computer will try one path and get blocked and try another and another and another. A quantum computer can try all paths at the same time”.

So what does this mean for us on the ground? The potential is almost immeasurable. Our progress in DNA sequencing is directly correlated with our computational speeds: a quantum computer could, in theory, examine every possible scenario of how a drug could interact with the human body, leading to a renaissance in more effective drugs. Quantum computing could also lead to democratised, unbreakable cybersecurity barriers (as well as near-limitless hacking capabilities) and ever more accurate atomic clocks (the technology behind GPS, among other things), and could understand and react to changes in the stock markets at speeds unfathomable in the past.

And that’s not all. With our current technology it is impossible to map the human brain in all of its complexity; even mapping a fly’s brain currently takes two years. Space exploration too will receive a massive upgrade: sifting through the billions of pieces of data generated in the search for new planets will become an accomplishable endeavour. The power of quantum computing will also allow us to map proteins as we do genes, and even to model how molecules assemble at an atomic level – possibly paving the way to the creation of new materials.

Currently, the US pledges $200 million annually for quantum computing research, and China is building a $10 billion research centre for quantum applications. Google, IBM and Microsoft are each set on being the first to build a large-scale version of these paradigm-shifting machines. When they eventually do, the impressive resilience of Moore’s Law will seem like a footnote in technological history.

So what is quantum computing? It’s our ticket to the next step in human innovation, and we’re likely to be given it sometime soon.

Battery electric vehicles are moving into the fast lane

Each year, the KPMG global automotive survey asks 800 auto executives to rank the eleven most important industry trends. In 2015, Battery Electric Vehicles (BEVs) came in second to last. A year later they jumped to second place, and this year they ranked first. BEVs are now seen as the most disruptive trend in the industry, with major automakers pouring billions into development. The Volkswagen Group recently announced that it would invest $84 billion in electric cars and batteries and would offer 300 different models by 2030. So what has changed to drive the focus so dramatically from the internal combustion engine to autonomous electric vehicles?

Today, BEVs represent about one percent of U.S. vehicle sales. Why? Because consumers worry about their limited range and the lack of public charging stations. BEVs are also more expensive than conventional cars, and the generous U.S. federal tax credits are only useful if you make a lot of money and pay a lot of taxes. However, new lower-cost models with far greater range are at last becoming available.

The Chevy Bolt, which starts at $37,500 but has at times been discounted by as much as $5,000, has a range of 238 miles (383 km) on the strict U.S. EPA combined test cycle. The basic Tesla Model 3 lists for $35,000 with an EPA range of 220 miles (354 km); Tesla also offers a 310-mile (499 km) option, though at a significantly higher price. Given the $7,500 federal tax credit and state tax credits of as much as $5,000, these BEVs are quite competitively priced: a $35,000 Model 3 can effectively cost as little as $22,500.

Public availability of charging stations is less of an issue when a BEV’s range exceeds 200 miles. Given the average consumer’s daily miles traveled, home chargers can easily replenish the battery overnight. Fast chargers are, however, important for long trips. That is why Tesla and other manufacturers are building out networks of rapid-recharge stations along major highways and within cities. The 310-mile version of the Tesla Model 3 is reported to add 170 miles of range with 30 minutes of high-speed charging, about the average time it takes Americans to order and choke down a fast-food meal. Tesla has said their eventual goal is a five-minute recharge.

Batteries have long been the largest cost in building an electric vehicle, but they are rapidly decreasing in price. In 2010, battery cells cost about $700 per kilowatt-hour. This year the cost stands at around $139 – a factor-of-five decrease in seven years. Tesla and GM predict that by 2021 battery costs will fall below $100 per kilowatt-hour.
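
As a quick back-of-envelope check on those figures (using only the numbers quoted above, no new data):

```python
# Sanity-checking the quoted battery-cost decline: $700/kWh (2010) to
# ~$139/kWh seven years later is almost exactly a 5x drop, or roughly
# a 21% average decline per year.
start, end, years = 700.0, 139.0, 7
overall = start / end                       # ~5.0x total decrease
annual = 1 - (end / start) ** (1 / years)   # ~0.206, i.e. ~21% per year
print(f"overall: {overall:.1f}x, average annual decline: {annual:.1%}")
```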

Electric cars also require very little maintenance compared to traditional gasoline or diesel cars. An internal combustion engine alone typically has over 2,000 moving parts; the entire drivetrain of an electric vehicle has fewer than 25. Electric vehicles don’t even require oil changes or lubrication. Other than tire rotations and cabin air filter changes, the first required maintenance for a Chevy Bolt is changing the battery coolant fluid at 150,000 miles. That is bad news for auto dealers since many receive over a third of their revenue from maintenance.

The future of automotive is not only electric but also autonomous. The technology is rapidly improving while costs continue to tumble. The computers and sensors required for autonomous operation, which cost almost $100,000 eight years ago, have fallen below $10,000 and are expected to cost less than $5,000 within five years. LiDAR systems (think laser-based radar) with lots of moving parts are being replaced by solid-state units that cost almost a hundredth as much. Today’s driverless computers, which perform five trillion artificial-intelligence operations per second, will be replaced by units over fifty times as fast within the next three years.

There are still problems to be solved and systems to be exhaustively tested before this technology arrives at your door. These systems are learning to operate in the most difficult weather, such as heavy snow and driving rain, and are rapidly improving. They have learned to identify and act on hand gestures from policemen and cyclists, though not those rude gestures you learned at too early an age. In several cities these vehicles are already delivering mobility services, though with safety drivers behind the wheel. Google is reported to have plans to eliminate these human drivers altogether in its Phoenix operations sometime during the coming year. Some vendors are also adding text displays to better communicate with pedestrians and cyclists.

While America’s government may be dominated by climate change deniers, polls demonstrate that the opposite is true for our citizens and for many of our state governments and largest corporations. Due to the shutdown of obsolete coal plants and the rapid growth of renewables for electricity generation, transportation is now the #1 source of U.S. CO2 emissions. Almost all of this comes from burning petroleum in internal combustion engines, which mostly produce heat. While these conventional engines are only 20% efficient, today’s electric motors can convert 90% of renewable electric energy into mechanical motion.

The cost per mile of electricity is a third that of gasoline, and electric ‘fuel’ prices are far more stable. Our planet and our children deserve better.
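
That “a third the cost per mile” claim is easy to reproduce with rough numbers. The prices and efficiencies below are illustrative assumptions (typical U.S. figures), not data from this article:

```python
# Rough per-mile fuel-cost comparison. All inputs are assumed,
# illustrative values, not figures from the article.
electricity_price = 0.13   # $/kWh, assumed residential average
ev_miles_per_kwh  = 3.5    # typical BEV efficiency, assumed
gas_price         = 2.50   # $/gallon, assumed
miles_per_gallon  = 26.0   # typical U.S. car, assumed

ev_cost  = electricity_price / ev_miles_per_kwh  # ~$0.037 per mile
gas_cost = gas_price / miles_per_gallon          # ~$0.096 per mile
print(f"EV ${ev_cost:.3f}/mile vs gasoline ${gas_cost:.3f}/mile")
```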

However, raw economics does not account for the cost in human lives. Each year, 40,000 people die on America’s highways. To put that in perspective, a Boeing 737 would have to crash five times a week for a year, killing everyone on board, to match that horrific death count. America’s highway epidemic is a national disgrace. Another four million Americans suffer serious injuries, many of whom never again live normal lives. Worse still, the worldwide death toll exceeds 1.2 million.

About 94 percent of those accidents are due to human error. We have an antidote to this epidemic: autonomous electric vehicles. They don’t drive drunk, fall asleep or get distracted, and they rarely make errors. They could save hundreds of thousands of lives.

We can do better, and we must do better. We cannot allow the perfect to be the assassin of the good. The time for change is now.

The human side of robots

Robots have always been perceived as artificial beings and have faced many limitations in the real world. They have had inexpressive faces, moved too rigidly, and were more alien than human in nature. But then, between 2015 and 2016, something happened: a slew of gynoids was presented to the world – Erica, Jia Jia and Audrey Hepburn lookalike Sophia. These robots forced us to redefine what we knew about humanoids. All were human-like AIs, programmed to interact in real time with real people, and capable of sustaining conversations and changing facial expressions according to the tone of the talk. They can all be improved, and undoubtedly will be.

But the most critical, revolutionary and unprecedented goal that science is pursuing is another: to create robots which can not only talk, move and behave like human beings but also have feelings and emotions. Science is trying to dissolve the main dichotomy between humans and machines – that of emotion – and it is devoting enormous energy and resources to doing it.

Affective computing, also known as Emotion AI, is a multidisciplinary field which combines engineering and computer science with psychology, neurobiology, cognitive science, neuroscience, sociology, psychophysiology and many other disciplines. Professor Picard, founder and director of the Affective Computing Research Group at the MIT Media Lab, describes it as: “Computing that relates to, arises from or deliberately influences emotions or other affective phenomena”.

Already by the mid-nineties, programmers and researchers were trying to teach robots how to recognize human emotions and interact with people accordingly. Professor Cynthia Breazeal and her team ran experiments with Kismet, a kind of robotic head with moveable eyes, eyelids and lips. Breazeal, who directs the Personal Robots Group at the MIT Media Lab and is considered a pioneer of social robotics, designed a robot that could interact with people and express a limited range of feelings, such as sadness, fear, happiness and the need for attention. An erratic movement, or one in close proximity, could scare the little electronic head, just as a smile could make it smile back.

At the same university, two computer scientists, Irfan Essa and Alex Pentland, developed computer vision systems that used measurements taken from video for detailed facial modelling and the tracking of facial expressions. It worked well: each feeling produced a facial expression requiring a unique combination of muscle movements, and the program was able to detect whether a volunteer was faking an emotion 98 times out of 100.

Of course, recognizing a facial expression or mimicking one doesn’t necessarily equate to sentience. Another robot, Pepper, was unveiled in 2015. This small bot was designed and manufactured by Aldebaran Robotics SA, a French company which had been bought by Japan’s SoftBank three years earlier. The four-foot-tall Pepper was widely hailed as the first robot to have emotions. It had cameras and sensors that helped it read body language and understand an individual’s mood, and its emotional states were displayed on a tablet-sized screen on its chest.

Human emotions are, in and of themselves, a mystery. Emotions are generated by external as well as internal stimuli and are then revealed by physical signs. Respiratory rate, pulse, gesture and facial expressions change along with our mood.

Another revealing sign, however, is our voice and how we use it. Last year, a group of researchers at the Hong Kong University of Science and Technology announced “the first known system that can recognize a dozen human emotions from tone of speech and in real time”. The “real time” element eliminates the processing delay caused by “feature engineering”. As professor Pascale Fung explains: “Emotions in speech are represented by not just the pitch, but the chroma, the tempo, and the speed of that voice. Machine learning needs to first perform feature engineering to extract these characteristics. For tone of voice, feature engineering typically extracts 1000-2500 characteristics from the input audio.” This process takes time and causes a delay in the communication between the human and the robot. Now, however, with advances in deep learning methods such as convolutional neural networks, researchers have solved this problem.
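
To make that shift concrete, here is a minimal sketch of the CNN approach described above: a network that classifies emotion directly from a spectrogram of the speech signal, with no hand-built feature pipeline. The input shape, layer sizes and class count are illustrative assumptions, not the HKUST team’s actual architecture.

```python
# A toy emotion-from-speech CNN (Python, assumes TensorFlow/Keras).
# Input: a mel-spectrogram of an utterance, treated as a 128x128 "image".
# Output: one of a dozen emotion classes, as in the article.
from tensorflow.keras import layers, models

NUM_EMOTIONS = 12  # "a dozen human emotions"

model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),        # spectrogram, one channel
    layers.Conv2D(32, 3, activation="relu"),  # learns local time-frequency patterns
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.GlobalAveragePooling2D(),
    layers.Dense(NUM_EMOTIONS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```

The point of the sketch is the design choice: the convolutional layers learn their own time-frequency features, replacing the 1,000-2,500 hand-extracted characteristics and the latency they introduce.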

Last month, Google announced a new labeled dataset of human actions taking place in videos. Its Atomic Visual Actions dataset (or AVA for short) aims to understand “what humans are doing, what might they do next, and what they are trying to achieve”. AVA will teach machines how humans move and operate in the world: each three-second clip (of which there are 57,000) is bundled with a file that outlines the person the machine-learning model should be watching, accompanied by a description of their pose and whether or not they are interacting with anyone or anything else.
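
For a sense of what such a label file looks like, here is a sketch of parsing one AVA-style annotation row. The column layout (video id, timestamp, a normalized bounding box for the person to watch, an action label and a person id) follows AVA’s published CSV format, but the values here are made up for illustration.

```python
# Parsing a hypothetical AVA-style annotation row (Python standard library).
import csv
from io import StringIO

row = "vid_001,0902,0.21,0.10,0.55,0.98,12,0\n"  # illustrative values only
for video_id, t, x1, y1, x2, y2, action_id, person_id in csv.reader(StringIO(row)):
    print(f"video {video_id} at {t}s: person {person_id} performs "
          f"action {action_id} in box ({x1},{y1})-({x2},{y2})")
```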

Once the secrets of our facial expressions and vocal features are known, robots can be programmed to replicate them. The result is a technology which is not at all human but is able to detect our feelings and respond appropriately. This is particularly valuable for home and hospital care robots – Pepper, for example, was also engineered to work with patients affected by dementia.

Affective computing is a fast-growing sector. According to a report published last March by the research and consultancy firm MarketsandMarkets, affective computing’s global market is expected to skyrocket in the next few years: in 2016 its value was believed to be around $12.2 billion, and by 2021 it is expected to reach some $53.98 billion.

Last February, global IoT solutions provider Hubble presented Hugo, “the first truly intelligent smart camera with personality, featuring Emotion AI video analytics”, powered by the Boston-based company Affectiva. This device can identify family members and ascertain their emotional states, and could become a staple household item in the near future.

Affectiva, the emotion recognition software and analysis company behind it, is in a particularly favorable position. The start-up has built a huge database of emotions, possibly the biggest in the world: it comprises 24,000 ads and over 5 million faces (recorded and analyzed) across 75 countries. Built on the analysis of so many samples, its emotion-detecting software can be remarkably precise and unbiased.

The technology Affectiva masters so well, however, has a potential which goes well beyond the home security systems market or that of the SDKs and APIs sold to developers. Its emotion detection is already in wide commercial use: “leading market research firms like Millward Brown, LRW, Added Value and Unruly, and 1/3 of the Fortune Global 100, including 1,400 brands like Mars, Kellogg’s and CBS, use Affectiva’s Affdex for Market research to optimize content and media spend”. These areas will benefit greatly from these technological developments.

We are still a long way from having truly human humanoids, and it is possible that it may never happen. But affective computing is making big gains in driving us into this new realm of technological possibility. Along the way, this AI is going to uplift the capabilities of sectors such as advertising and market research, and, as it improves, more sectors will begin to capitalise on the technology in ways we cannot yet imagine. Most importantly, however, this technology has the potential to bridge the gap between technology and nature. Whether or not we should burden these entities with emotions remains to be seen, but right now developments are putting a bigger emphasis on the human part of humanoid than ever before.

Battle stations: Businesses, governments and cyberwarfare

Menny Barzilay, Founder and CEO of FortyTwo, sits down and lays out the impending dangers that billions of affordable, mass-produced IoT devices will bring to the world of security, and how it is impossible for countries today to continue with the idea that physical borders can protect them from the warfare of the modern world.

Fostering a creative mindset to achieve innovation

Companies built upon the fear of failure are doomed to fail themselves; instead, companies must celebrate curiosity. Fortunately, the solution is clear: to reframe themselves, they need only follow three simple steps – inspire, create and innovate.

Inspire

When was the last time you were inspired? So inspired you did something about it? The truth about inspiration is that it can happen anywhere. For me, inspiring others is about getting people excited about a reality that is attainable but hasn’t yet been achieved. In other words, it’s about generating excitement around new ideas and motivating people around the concept of change. Since change resonates most when something is at stake, stories that involve overcoming a huge challenge tend to be the most inspiring.

Create

Traditionally, creativity has been associated with artists, authors and those questioning the world around them. That said, I believe we are all inherently creative; we just need a safe environment where we can put our creativity into practice. As children, our home is the safe place that allows us to question the world. Creativity, and the ability to create, start with being a keen observer of one’s surroundings and being insatiably curious. With the rapid ascent of automation in the workplace, I believe any repetitive job that doesn’t have an emotional, creative connection is finished.


Innovate

Google. Apple. Facebook. Amazon. Quite possibly the most innovative companies of our time. It’s their sense of purpose and culture that enable them to “move fast and break things” at scale. These are work environments synonymous with innovation – places where creative destruction has been ingrained into the company culture through the founders’ original visions. They are also environments where the process of innovation thrives based on questioning the status quo, a bias towards action and collaborative teamwork.

The first step, questioning the status quo, is the start of all creative endeavors. In essence, it’s about challenging the current way of doing things, and it is sparked by one’s own curiosity. This is also why innovation rarely happens inside established corporate cultures: change is not welcome in companies whose finely tuned business models are the bread and butter of their existence. That said, as technology becomes ever more accessible, there will always be better ways of doing things. So, how does one approach a complicated problem with the goal of developing an original solution?

The best way to come up with a novel solution is to break the problem down to its fundamentals – its first principles. Over two thousand years ago, the ancient Greek philosopher Aristotle defined a first principle as “the first basis from which a thing is known.” Reasoning from first principles has been employed by Johannes Gutenberg, Nikola Tesla and Benjamin Franklin – all known for their breakthrough innovations. The most prominent entrepreneur of our times, Elon Musk, has championed this way of reasoning in nearly every new venture since selling PayPal. Reasoning by first principles means understanding the fundamental truths of a given situation and starting from there.

A first principle is a basic, foundational, self-evident proposition or assumption that cannot be deduced from any other proposition or assumption.

Take, for example, last year’s presidential election: Trump’s campaign focused on reducing votes for the opposition instead of gaining votes for himself. This was based on data showing that Trump would not be able to win more than his hardcore fans’ votes. He did just that – reducing the number of votes for the opposition – and became the 45th President of the United States. If organizations have the ability to question the status quo, they have a better chance of staying relevant. But this is just the beginning.

Once there is an open-ended question that challenges the status quo, there has to be a move to answer it. I call this ‘a bias towards action’: creating a culture and environment where trying things out (with the possibility of failure) is welcomed and encouraged. Innovative organizations build the idea of “trying things out” into their culture. This entails not only freeing up the human capacity to do something but also having the resources and connections to move ideas from theory to reality.

In the 1950s, 3M had 15% time, which resulted in the invention of Post-it notes and masking tape. Gmail, AdSense and Google News all originated from Google’s 20% time. While at Google, I spent part of my 20% time teaching creative skills for innovation to internal teams, clients and non-profits. Another way of empowering people to innovate is Adobe’s Kickbox – a cardboard red box containing money, a checklist of actions to achieve, learning frameworks, and sugar and caffeine to get the ideas flowing. In the end, addressing innovation is about understanding the power of making things happen outside one’s comfort zone and with as little bureaucracy as possible.

Lastly, innovation is about collaborative teamwork. This can be achieved through an organizational culture that embodies a strong sense of empathy and diversity – a place where risk and failure are not feared but embraced as tools for learning. Creating an environment where everyone feels they can be themselves is also one of the most difficult steps to achieve, as the majority of companies today are still structured in silos.

In the 1900s, during the Industrial Age, when factory-line work was the only way to produce things at scale, structuring an organization in silos was logical. The challenge with siloed organizations is the lack of empathy each group has for the others, as people focus on their own piece of the pie. Innovative organizations understand that a diverse group of people with a strong sense of empathy for one another is paramount to staying relevant today.

The Fourth Industrial Revolution is upon us. Compared with the previous revolutions, the Fourth is evolving at an exponential rather than a linear pace.

These three skills (questioning the status quo, a bias towards action and collaborative teamwork) are the key to developing innovation individually and in organizations. Organizations that foster innovation understand that the work environment has to be a creative environment: a place where people question, make things and collaborate across different disciplines. To introduce teams and companies to the creative process and cultivate their own creativity, my experience has demonstrated that doing is better than saying. Therefore, when I’m not delivering a keynote on creative leadership, I develop and facilitate workshops that focus on making ideas real.

As a tangible experience, workshops are a form of learning with the ability to make the concepts above resonate. One example of a workshop I developed and facilitate has people from diverse backgrounds form teams and build autonomous drawing machines. Every time I have facilitated this workshop, people get first-hand experience of each of the steps mentioned earlier: questioning the status quo using first principles, a bias towards action with the given materials in a constrained time limit, and collaborative teamwork. From my experience, the most successful teams are the ones that start trying things out right from the start and end up iterating on their original idea once they have a first version running. It is their creative mindset, playfully at work making things, that inspires me to move forward.

This is my current process, but it is always evolving, always changing. Organizations that fail to understand that change is constant succeed only in becoming less and less relevant. And those that are driven by fear instead of curiosity will simply not be able to compete in the future. The organizations that create and nurture a culture of creativity – built on a foundation of empathy, diversity and action – will be the ones that stay relevant and shape our world for a better tomorrow.

The rise of “data-first” companies

If you read most tech blogs or publications, chances are you’ll see the terms Machine Learning (ML), IoT (Internet of Things) or Synthetic Biology (SynBio) at least half a dozen times. Though these fields have been around for decades in some shape or form, it is only recently that building applications, products and services with them has become affordable enough for wide-spread adoption to begin. In this article, I’ll explore a few examples of how these new technologies are coming into the market and how companies can prepare for and leverage the new opportunities that concomitantly arise.

The most tangible example of new technology invading our daily lives has to be connected devices – or the Internet of Things, as it’s often called. The massive amounts of data sourced through IoT present incredible opportunities for all industries, and ideological problems for many businesses. The ability to communicate with remote machines, devices and systems has opened up new efficiencies as well as created new business models. One of my favorite examples is how companies that have not traditionally been seen as technology companies are embracing connectivity solutions to create new value and benefit for themselves and their customers.

For example, companies such as Caterpillar and John Deere have IoT systems in place that monitor their tractors and specific systems/parts to let them know when they need to be changed or to warn of any potential malfunctions. This lets the machine vendors offer servicing and maintenance more efficiently and at a lower cost, since they can predict their upcoming work orders more readily, and also allows for their customers to have less down time and fewer issues with their equipment – prevention is often much cheaper than cure. What’s important to note here is that this is an infrastructure play – the end consumer or user is rarely aware of what’s happening behind the scenes (or literally under the hood in this case).
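
As a concrete illustration of that infrastructure play, here is a minimal sketch of the kind of telemetry rule such a vendor might run on incoming machine data. The field names and thresholds are assumptions for illustration, not anything Caterpillar or John Deere has published:

```python
# A toy predictive-maintenance check on machine telemetry (Python).
# Field names and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Telemetry:
    machine_id: str
    engine_hours: float
    hydraulic_temp_c: float

def needs_service(t: Telemetry) -> bool:
    # Flag on a fixed service schedule, or early if a sensor runs hot.
    return t.engine_hours >= 500 or t.hydraulic_temp_c > 95.0

reading = Telemetry("tractor-042", engine_hours=512.0, hydraulic_temp_c=88.5)
if needs_service(reading):
    print(f"schedule maintenance for {reading.machine_id}")
```

In practice a vendor would replace the hand-set thresholds with models learned from fleet-wide data, which is exactly where the machine learning discussed below comes in.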

In addition to digital technologies, the arsenal of biological tools is also expanding at an exponential rate. Synthetic Biology has been receiving its fair share of the spotlight as well. The field of SynBio involves manipulating biological systems to perform various tasks much like the way a computer scientist would program a computer – anything from manipulating DNA using CRISPR to try and create life-saving therapies, to using yeast or bacteria to produce new proteins and materials, to growing meat and other food products in labs.

One of the more common applications in SynBio is to use organisms like yeast or bacteria as small factories to produce new molecules. Insulin is probably the most well-known drug made by such a process: the DNA that encodes the insulin protein is inserted into bacteria, which then read this DNA and produce the protein. Today, many exciting startups are using this basic technique to create anything from spider silk (without spiders) to milk (without cows) and eggs (without chickens).

While on the surface it may seem that IoT and SynBio are not closely related, they share a very strong common link: both fields generate and consume massive amounts of data. For IoT, this is fairly easy to see – sensors and digital communications between devices. For SynBio, the role of data may be harder to see, but I would argue there is far more of it to be dealt with, since the foundation of SynBio is understanding the genomes of virtually all living creatures. Furthermore, the synthesis and manufacture of biological products involves massive amounts of real-time process data to maintain quality control during production. The link becomes obvious once you see that both fields must rely on data science and computational technologies to navigate their myriad data sets.

When talking about data science, “Artificial Intelligence” and “Machine Learning” are used often in popular journalism, but what exactly do these terms mean? Artificial intelligence (AI) has generally been used broadly to refer to the ability of computational systems to mimic the apparent cognitive functions of human beings – or, in other words, to be able to learn and understand information similar to how a human does. Machine Learning is a subset of AI that specifically focuses on enabling computers to learn from data without being directly programmed to do so. One of my favorite definitions of ML is that it is really just “automated statistics”. What this means is that from large data sets, you can extract trends and, in the case of a remotely-monitored tractor, see what the likelihood of something breaking down might be, or in the case of genomics, where a particular gene may be located or how it may get expressed. With such large data sets, it’s nonsensical to examine individual data points – the value is in the aggregate picture and that’s exactly what ML excels at – and that’s exactly what is needed to ingest, analyze, and make sense of the large data sets that new fields like IoT and SynBio are generating.
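
Here is “automated statistics” in miniature – a sketch of fitting a model that turns raw sensor readings into a breakdown probability, in the spirit of the tractor example above. The data is synthetic and the feature names are assumptions; it requires only NumPy and scikit-learn:

```python
# "Automated statistics": learn breakdown risk from (synthetic) telemetry.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))                 # e.g. vibration, temperature
y = (X[:, 0] + 0.5 * X[:, 1] > 1).astype(int)  # synthetic "broke down" label

model = LogisticRegression().fit(X, y)
risk = model.predict_proba([[1.5, 0.8]])[0, 1] # probability of breakdown
print(f"breakdown risk: {risk:.0%}")
```

No single reading decides the answer; the value comes from the trend the model extracts across the whole data set – the aggregate picture described above.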

While it’s no surprise that we live in an increasingly data-driven world (recent reports suggest that 90% of the world’s data was generated in the last two years!), this nonetheless gives rise to organizational challenges around how companies handle and leverage this abundance of data. We are in a world where the cost of collecting data is rapidly falling, and the trap I have seen companies fall into is collecting data without a clear strategy for its use. In other words, the sentiment has often been: “We can collect all of this data easily, so we will.” The risk is that an organization quickly becomes overwhelmed by data and loses sight of its value. The classic examples that come to mind are the first generation of companies in the late ’90s and early 2000s that combined ML and genomics to build bioinformatics products. That was a classic case of building products on top of rapidly growing data sets without really having a sense of what the true value of the data would be. Unfortunately, most of those companies are no longer around, since they fell squarely into the aforementioned data trap.

What we’re seeing today is a new generation of companies that I call “data-first organizations”. What this means is that the company puts data science, analytics and IT at the center of its business and then applies these core skills to certain products and verticals. Amazon is an obvious paragon of such an organization – they analyze and adjust pricing thousands of times per second – creating a pseudo-stock market founded on their data analysis. But here’s the clever part – they are really a data-first company that just happens to be a consumer goods retailer. Understanding their data is so much at the center of their business that they built their own data infrastructure and eventually offered up that capability as a product called AWS (Amazon Web Services).

We’re seeing similar patterns across all industries: the cost of data collection is rapidly falling, so competing companies must find ways to harness this data in new ways to remain relevant. Circling back to the earlier examples, companies like Caterpillar and John Deere are transforming into data-first companies that just happen to make heavy machinery. SynBio companies are investing heavily in data science, and they just happen to do biology – companies like Ginkgo Bioworks, Emerald Cloud Lab and Transcriptic, to name a few. All of these companies have complex, sophisticated data ingestion, storage, analytics and visualization capabilities that form the backbone of their operations.

So, given all these advances in how ML and data science are used across different fields, it should be obvious to any organization that adopting a data-first strategy is paramount. Unfortunately, knowing what direction to move in and figuring out how to overcome organizational inertia to get there are two very different things. More traditional companies have to fight years of company culture and infrastructure impediments, often with limited success.

The biggest impediment to becoming a data-first company is how leadership responds when the collected data contradicts intuition. All too often I’ve seen leaders dismiss data in favor of their own beliefs and biases. While there are certainly times when intuition will trump everything else, any time data and intuition are at odds presents a remarkable opportunity for innovation.

Throughout history, innovation (and indeed invention itself) has happened when data and intuition are at odds. Put another way, it’s when something unexpected happens.

Let’s take a minute to reflect on this: we now live in a world where data is accessible from so many different sources – be it via connected devices or from gene sequences – and we have the tools to analyze this data to find seemingly hidden patterns. We have more opportunities than ever to challenge our intuition and beliefs – to observe the unexpected. By embracing a data-first approach, companies have a much greater opportunity to constantly challenge their inherent biases, beliefs, and intuition with constantly updated data from their operations. Indeed, this is why companies like Amazon and Caterpillar remain competitive – they share this common thread of challenging their beliefs with data. It’s the only way to stay competitive.

Data doesn’t lie.

What automakers need to do to thrive in the future of mobility

Some companies have been able to survive past technology transitions, and others have even emerged stronger as their industry consolidated. However, many players that once thrived are gone today, and every one of these transitions sees new players emerge – players who were able to embrace the new technology without having to conserve legacy architectures, maintain current systems or protect existing business models.

Remember when the iPhone came to the market with its iOS operating system? iOS was built from the ground up for a new paradigm of touch interfaces and purpose-built apps. And it was architected to eventually empower a huge ecosystem of app developers. After being in denial for a while, Microsoft rewrote their mobile platform twice after trying to innovate on their existing platform. They even ended up buying Nokia. Today they are irrelevant in smartphone operating systems and the Nokia acquisition is written off. Market leaders like Blackberry, Nokia, and Motorola were wiped off the market and no longer play a role. Today the iPhone business alone is bigger than all of Microsoft.

According to Gartner, Nokia’s market share in 2007 was a dominant 49.4%. Six years later, in 2013, it was 3%. That same year, Microsoft bought Nokia’s handset business for $7.2 billion, and in 2015 the acquisition was written off.

A similar shift happened with Cloud Computing and “Software as a Service” (SaaS). I remember the struggles at SAP, Microsoft, and Infor to try to re-architect technologies and applications for the cloud and for a world of SaaS.

The incumbents ignored the shift as long as they could. They tried to protect their existing business models and technology investments. Much like in the car industry today, the software that was written for the world of on-premise systems did not fit the new world of cloud computing. Today players such as Salesforce.com and Amazon dominate the cloud.

It is difficult to embrace a new paradigm with legacy technology. More so than ever, technology becomes critical in delivering new business models. When I look at the Automotive industry today, I see a lot of parallels. The industry is facing disruptive change on multiple fronts: from connectivity to driverless cars, electric drivetrains and Mobility as a Service (MaaS). The more I learn about existing vehicle architectures and the ECU mess in today’s vehicles, the more I am convinced that we cannot build the future of mobility on yesterday’s technology.

Maybe the most transformational change will come with autonomous vehicles. The race towards robotic cars is on, and the industry is making rapid progress towards a future of automated and ultimately fully autonomous cars. I think most of the folks who work on robotic cars daily would agree that, despite the progress, there are still a lot of potholes to be avoided. At the same time, the race is intensifying. Disruptors like Uber desperately need robotic vehicles to make their business models work and to justify their high valuations.

Some of the leading OEMs are pushing hard to make autonomous cars a reality and are transforming themselves into Mobility Services providers. They realize that future business models will be different from today’s. The engineering teams at these OEMs are under increasing pressure to deliver faster and faster. But existing legacy architectures are an impediment to achieving agility – like bringing the proverbial knife to a gunfight. You cannot compete at the speed of digital on a legacy architecture.

In the old world, OEMs relied on their strategic Tier 1 partners for innovation. But in the process, they became too dependent on these suppliers, who themselves struggle in this new world. Existing players seek to extend the life of their existing platforms. As we saw with Microsoft, it is very hard to face reality and discard the legacy to start from scratch.

Legacy platforms have not been built for a world of connected vehicles. Tesla’s “Over the Air” updates have already proven to be disruptive to incumbents. As we move into a world of autonomous fleets, shared mobility and highly personalized connected vehicle services, data will be at the core of emerging business models. Using technology to turn data into services that can be monetized profitably is the name of the game.

But data is not only valuable. It can also be a liability. Connected vehicles will present a major cybersecurity and brand risk. You cannot duct tape a fragmented infrastructure and make it secure. Acquiring cybersecurity startups will not fix a broken architecture. You need to design with security in mind from the starting line.

When you talk to innovation leaders in Automotive, you can clearly hear the frustration. The passion of these people to innovate clashes with the corporate culture, the existing processes and the legacy technology that keeps engineering teams from innovating faster.

The systems being tested on the road today are still a long way from being production-ready. We need platforms that are ready for series production. We can’t fill an entire trunk with hardware; we need to make these systems cost-effective, energy-efficient and, of course, fully Automotive compliant.

New Mobility introduces a new paradigm, and connected car platforms will be the brains of these vehicles. Microsoft did not lose its position because a competitor did better at the same game; companies such as Google, Apple, Facebook and Amazon fundamentally changed the way the game is played. I am quite certain nobody will be better than today’s car companies at building cars at scale, in consistent quality and at low cost.

Some of the big OEMs that have the financial means are bringing more development in-house – much of the engineering of software and hardware systems – in order to achieve the level of control and agility they need. But the OEMs of today are not tech companies. They need platforms that help them focus on delivering solutions, make their developers productive enough to compete at the scale of the tech giants, let them ship products faster and faster, and help them build for the best possible security.

Even if OEMs are able to master these software and hardware engineering problems, the real challenge will be business model innovation. Just like Microsoft, automakers will struggle with the fact that the products that helped them create global empires are now merely platforms for new mobility services.

The further we get on this journey towards autonomous vehicles, the more apparent the deficiencies of legacy architectures will become. Just as in the world of mobile computing or the shift to the cloud, those who try to take their legacy into the future will fall further and further behind. OEMs need to take control of technology platforms from their Tier 1 suppliers and stay independent of the tech giants. They cannot afford to lose control over the customer experience.

I think there is an opportunity for new players to build a next-generation connected vehicle platform that empowers OEMs in their battle with the tech giants and the new emerging players. That said, creating a new platform from scratch is challenging. Software is one thing; hardware is harder. Building a platform that is truly strategic to the future of mobility, and at the same time meets the regulatory and compliance requirements of Automotive, is a real challenge. Can it succeed? Will a new player emerge, or will the incumbents be able to protect their turf? We shall see.

What is Ransomware?

Being held to ransom is nothing new: since the dawn of man, members of rival groups have been captured and traded for ransom. Today, thanks to advances in technology, the rise of the internet and the interconnectivity of networks the world over, being held to ransom can mean something very different.

Ransomware is the world’s fastest growing form of computer virus. Every 40 seconds, a ransomware attack is attempted on a company, and 71% of the time these attacks succeed. Ransomware entails holding an individual’s data hostage, with the threat of either blocking access to it permanently or sharing it for all to see (whichever is worse) until the ransom is paid. Global ransomware damage costs (which include time wasted dealing with attacks’ aftermath) have been predicted to exceed $5 billion in 2017, up from $325 million in 2015.

The more refined ransomware attacks involve cryptoviral extortion: encrypting the victim’s files, making them inaccessible, and demanding a ransom payment to decrypt them. Without the decryption key, recovery is almost impossible without paying the attackers. This, coupled with the use of difficult-to-trace digital currencies as the means of payment, means that a ransomware attacker can be very difficult to identify and prosecute.
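
To see why “almost impossible” is no exaggeration, a quick back-of-envelope sketch: assuming the files are locked with a modern 128-bit cipher key (an assumption for illustration; real attacks vary) and an absurdly fast attacker, brute force is hopeless.

```python
# Brute-force time for a 128-bit key (illustrative assumptions throughout).
keys = 2 ** 128                    # size of the keyspace
guesses_per_second = 1e12          # a very generous assumed attack rate
seconds_per_year = 3.15e7
years = keys / guesses_per_second / seconds_per_year
print(f"~{years:.1e} years to exhaust the keyspace")  # ~1.1e19 years
```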

This malicious software is now making headlines across the world and dominating the security industry. No industry is immune: 15% or more of businesses in the top 10 industry sectors have been attacked, and IT services provider Intermedia has reported that, during the past year, 48% of IT consultants have seen increases in ransomware-related support inquiries across customers in 22 different industries.

According to the Cisco 2017 Annual Cybersecurity Report, this new business model is growing at a yearly rate of 350%. Traditional security solutions are simply struggling to keep up with the incredible pace at which new ransomware variants are being produced. It might sound like an insurmountable situation, but with a bit of awareness you might be able to avoid your business becoming the next victim: one in five cases involving significant data loss came about through employee carelessness or lack of awareness.

So what is ransomware? It’s one of the biggest digital threats facing businesses today, but after reading this, it might have just gotten a little bit smaller.

A journey to the Jarvis era

Two or three years ago, something happened. Google released Google Glass, a device aimed not primarily at being AR glasses but at delivering contextual content. In the years that followed, AR companies built on this experiment, and today the big five (Apple, Microsoft, Facebook, Samsung and Google) are all making great gains in the development of viable AR technology. But it was Google Glass that really opened up the market, and we have it to thank for the wealth of content that enablers have since added to that initial hardware.

Today we are at the beginning of the end of reduced reality. We live in a multi-dimensional world, but the way we learn and operate is still stuck in 2D, relying on printed manuals, books and computer screens for our information. For the past 15 years, we have searched for information and knowledge in the same way: through 2D screens and search engines.

But as Walt Mossberg, the pioneering technology journalist, wrote earlier this year in his final column, The Disappearing Computer, these 2D mediums will soon begin to disappear. The rise of voice-controlled home hubs is a testament to this, taking us from the search engine era and flinging us into the Jarvis era – the era in which humans access knowledge through natural interfaces and information is available contextually.

As we have seen with the success of Pokemon Go! and Snap’s Spectacles, AR technology is in a way undergoing reverse technological determinism – it is adapting to society and not the other way around. Widespread consumer acceptance enables the spread of technology into the Enterprise sector, leading it to become more normalised and familiar for workers and consumers alike.

That is undoubtedly a good thing, because employees currently waste a lot of time searching. Each day, an employee spends 1.8 hours searching for and gathering information – nine hours per week. On average, workers need up to eight searches to find the right information, each lasting between 5 and 25 minutes. So there is a huge incentive for companies to get on board sooner rather than later. By ensuring that information is continuously and immediately available, the Enterprise industry will be able to revolutionize how its workers learn and work.

Reality+ (Virtual Reality, Augmented Reality and Mixed Reality) is on the cusp of revolutionising how industries, workers and societies function. Expect huge advancements in how we learn and a departure from the search-bar method we have used for the past 15 years. In its place awaits a way of learning that capitalises on a world saturated with on-demand information, seamlessly embedded into our real environment.