It would be reasonable to say that Artificial Intelligence (AI) has revolutionized how we live in the last decade. A time before Alexa and smartphones seems like a distant memory. However, considering Facebook was only founded in 2004, the first iPhone was released in 2007, Instagram launched in 2010, and Alexa entered our lives in 2014, it is clear that a huge amount has been achieved in a short space of time.
AI platforms and devices have been able to flood the market for several key reasons. First, computing power has advanced to a level suitable for scalable technology. Second, the rise of digital channels and social media platforms has filled the world with highly valuable data. Third, we now have a large pool of people with the skills to manipulate that data and deploy it in products and systems.
Combining these factors means spending on research, development and deployment of AI continues to rise. Although there are ongoing debates around the ethics and social implications of these solutions, innovation driven by AI is infiltrating all aspects of our day-to-day lives.
In 2020, it is impossible to imagine a life without AI. Throughout the history of the field, there have been times when investment slumped, leading to “AI Winters.” Given how dependent we now are on AI solutions, it is hard to see the same happening again.
In this article, we will look at emerging trends impacting the remainder of the year and beyond.
Machine and Deep Learning
Machine learning is an application of AI that has been around for some time now. It is the art of using data to make decisions based on predicted outcomes. An example might be how Amazon recommends products to customers. Deep learning takes these methods to a new level with techniques like computer vision in autonomous vehicles and automated content creation. These extensions of AI allow systems to tackle tasks once thought to require human cognition. If we ever want a commercialised stream of driverless vehicles, machines will need to develop an understanding of real-world concepts.
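As a toy illustration of making decisions from data, the sketch below recommends products from overlapping purchase histories — in the spirit of, but far simpler than, what a retailer like Amazon actually runs. All the names and data here are invented for illustration:

```python
# Hypothetical purchase history: user -> set of product IDs.
purchases = {
    "alice": {"kettle", "toaster", "mug"},
    "bob":   {"kettle", "toaster", "blender"},
    "carol": {"mug", "teapot"},
}

def jaccard(a, b):
    """Similarity between two users' purchase sets (shared / total items)."""
    return len(a & b) / len(a | b)

def recommend(user, purchases):
    """Suggest items bought by similar users that `user` doesn't own yet."""
    own = purchases[user]
    scores = {}
    for other, items in purchases.items():
        if other == user:
            continue
        sim = jaccard(own, items)
        for item in items - own:  # only recommend things the user lacks
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice", purchases))  # → ['blender', 'teapot']
```

Real recommenders work on millions of users with learned models, but the principle — predict what someone will want from what similar people did — is the same.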
This application of deep learning focuses on techniques that allow machines to form a deeper understanding of digital images, such as photos or videos. While machines have been able to recognise content for a while now (e.g. telling a cat from a dog), they don’t necessarily understand it.
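A toy glimpse of how vision models “see”: sliding a small filter (kernel) over a grid of pixel values to detect a pattern — here, a vertical edge. Deep networks stack thousands of learned filters like this; the image and kernel below are hand-picked purely for illustration:

```python
# 4x4 grayscale "image": dark left half, bright right half.
image = [
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
    [0, 0, 9, 9],
]
# Kernel that responds strongly where brightness jumps left-to-right.
kernel = [[-1, 1], [-1, 1]]

def convolve(image, kernel):
    """Slide the kernel over the image, summing element-wise products."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(kernel[a][b] * image[i + a][j + b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

print(convolve(image, kernel))  # response peaks at the column with the edge
```

The output is zero everywhere except the column where dark meets bright — the filter has “found” the edge.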
There are many industry examples, with autonomous cars often cited as reliant on computer vision. However, in healthcare, the method can assist with surgery by powering robots through image data. This opens up the potential for remote procedures instead of relying on an expert who isn’t readily available on site.
Although it can sound a little mythical, cloud technology has been a significant factor in the continued progression of AI and machine learning. Six well-known leaders have invested heavily in cloud technology, namely Amazon, Alibaba, Google, IBM, Microsoft and Oracle. The scalability and processing power available via the cloud have sparked the rise of IoT devices, as businesses are now able to analyse their data at speed.
Cloud technology also comes at a more affordable price tag, so small businesses can now develop scalable solutions as they attempt to keep up with larger enterprises.
IoT at the “Edge”
One of the limitations of IoT devices has always been latency. For example, if sensors are being used in a factory, data needs to be communicated in real-time for any data-driven functions to be efficient. While cloud computing has made this better, edge computing is turning it into a functioning reality.
This technology brings computation and data storage closer to the location it is needed, meaning response times improve and there is a saving on bandwidth. For example, AI chips such as those that are built into Amazon Alexa can do more processing on your local device and rely less on the cloud itself. The cloud database can focus more on storage and data backups. The cost of the server and data storage reduces, and customers get a faster response. Edge computing is a win/win situation.
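The split described above can be sketched in a few lines: the “device” processes sensor readings locally and only forwards what matters to the “cloud”, saving bandwidth and round trips. The threshold, readings, and function names below are invented for illustration, not any real IoT API:

```python
THRESHOLD = 80.0  # e.g. a temperature (°C) considered worth reporting

def edge_filter(readings, threshold=THRESHOLD):
    """Runs on the device: keep only readings the cloud needs to see."""
    return [r for r in readings if r > threshold]

def cloud_store(batch):
    """Stand-in for a cloud upload: just counts what would be sent."""
    return len(batch)

readings = [21.5, 22.0, 85.3, 21.8, 90.1, 22.4]
to_upload = edge_filter(readings)
print(f"Uploading {cloud_store(to_upload)} of {len(readings)} readings")
# Only the two anomalous readings ever leave the device.
```

The same pattern scales up: an AI chip in a smart speaker can run wake-word detection locally and only stream audio to the cloud once it hears the trigger.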
5G will facilitate edge computing with low latency and high capacity. This allows for AI-enhanced experiences that are in massive demand from consumers. For example, augmented and virtual reality can work faster and better, while voice assistants become ever more responsive. Imagine walking into a shop and seeing a similar setup to a digital experience. Perhaps you can swipe through a product catalog on a wall or scan items with your smartphone to get instant information and reviews. 5G and edge computing make this possible.
Automated machine learning to remove resource gaps
Some industries have struggled to keep up with developments in AI and machine learning, but it isn’t always because they don’t want to. Often, there is simply a lack of skills and technology, and they cannot find a way to fill the gaps. Automated machine learning (AutoML) helps to identify the most appropriate models and frameworks for a task. This means skilled staff have more time for analysis rather than having to work through the process of training and tuning models by hand.
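At its core, what AutoML automates is a search: try several candidate models on validation data and keep the best one. The sketch below uses trivially simple “models” (plain functions) to show the loop; real AutoML tools search architectures and hyperparameters the same way, just at vastly larger scale. All names and data here are illustrative:

```python
# Validation data: (x, y) pairs that roughly follow y = 2x.
data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]

# Candidate "models" — stand-ins for model families an AutoML tool would try.
candidates = {
    "constant": lambda x: 4.0,
    "identity": lambda x: x,
    "double":   lambda x: 2 * x,
}

def mean_squared_error(model, data):
    """Average squared gap between predictions and actual values."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

def auto_select(candidates, data):
    """Return the name of the candidate with the lowest validation error."""
    return min(candidates,
               key=lambda name: mean_squared_error(candidates[name], data))

print(auto_select(candidates, data))  # → double
```

The analyst never hand-tunes each candidate — the loop does the trying, which is exactly the drudgery AutoML removes.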
Many businesses are investing in AutoML and realizing that their teams can be far more productive, staying focused on the problem instead of getting bogged down in processes and workflows.
Internet of Things (IoT)
Most of us now own a smart home device. Post Covid-19, with social distancing becoming the norm, there will likely be growth in such devices. For example, smart refrigerators will be able to take stock of their contents and automatically order groceries. There will need to be rules and governance in place to stop them ordering at will, but this solves a problem consumers will expect to be solved in the foreseeable future.
With a low-latency 5G network, companies can also easily upgrade IoT devices with new software. In time, that could dramatically reduce the impact of bugs in the field.
Remote Sensors and Monitoring
As with IoT, Covid-19 has accelerated the need for businesses to invest in remote tracking capability. For example, during lockdown, if nobody can visit or inspect a factory, how do you ensure it is still running smoothly, or whether it can operate at all? In the remainder of 2020 and into 2021, there will be a surge in the production of remote sensors to solve this monitoring problem. Sensors that link to a central system can help businesses continue to operate unattended and even reduce staffing costs where applicable.
Data and personalisation
Providers of goods and services can now get an accurate view of their customers in real-time when they interact with them digitally. As consumers use more digital platforms and apps, companies learn more about them, allowing for fast and accurate customisation. For example, the images displayed on a website can change instantly, depending on the person viewing it. When scrolling through Facebook, we have all wondered how the ads seem to know what we are thinking about. The vast amounts of data now available make these real-time personalisations possible.
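Swapping an image based on the viewer can be as simple as a lookup against what the site already knows about them. The sketch below is a deliberately minimal version; profile fields, interests, and file names are all invented for illustration:

```python
# Image variants a site might serve, keyed by a visitor interest.
hero_images = {
    "running": "trail-runners.jpg",
    "cycling": "road-bike.jpg",
    "default": "storefront.jpg",
}

def pick_hero(profile):
    """Return the image variant matching the visitor's top interest,
    falling back to a generic image for unknown visitors."""
    interest = profile.get("top_interest", "default")
    return hero_images.get(interest, hero_images["default"])

print(pick_hero({"top_interest": "cycling"}))  # → road-bike.jpg
print(pick_hero({}))                           # → storefront.jpg
```

Production systems replace the hand-written lookup with predictive models scoring many variants per visitor, but the shape — profile in, personalised content out, in real time — is the same.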
Businesses always need to focus on the customer experience. Some have branded 2019 as the year of the chatbot, and that trend has continued into 2020. As these virtual agents gather more data and expertise, they are now at a point where it is difficult to know whether we are speaking with a human or a bot in some cases.
When Amazon first released Alexa, there were many questions the bot could not answer, as it did not have the experience or training data to do so. Fast forward six years, however, and Alexa now works with unsupervised machine learning models. Essentially, we now have algorithms that can infer meaning from the instructions given to them rather than relying on a rules-based approach. As more chatbots adopt a similar approach, the lines between humans and bots will become increasingly blurred.
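The difference between rules and inferred meaning can be shown with a toy intent matcher: an exact-match rule fails on any paraphrase, while matching by similarity copes. The intents and phrasing below are invented, and real assistants use learned embeddings rather than word counts, but the idea of matching by similarity rather than fixed rules is the same:

```python
from collections import Counter
from math import sqrt

# Hypothetical intents, each with one example phrasing.
intents = {
    "order_status": "where is my order",
    "refund": "i want my money back",
}

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (sqrt(sum(v * v for v in a.values()))
            * sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def classify(utterance):
    """Pick the most similar intent instead of requiring an exact match."""
    words = Counter(utterance.lower().split())
    return max(intents, key=lambda i: cosine(words, Counter(intents[i].split())))

# A rules-based bot matching exact phrases would fail on this paraphrase.
print(classify("where has my order got to"))  # → order_status
```

Because the classifier scores overlap rather than demanding an exact string, it generalises to phrasings it has never seen — the shift the article describes from rules to inference.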
AI and its applications are not going away and will cause a significant amount of change to everyday life over the next decade. While there has been a lot of buzz in the past that went unfulfilled, advances in skills, computing power, and modelling are ensuring that the hype is finally being realised.
The recent Covid-19 pandemic has put AI development into overdrive in some industries. Businesses have felt the heavy impact of being closed down for a period, underlining the importance of AI in automating tasks.
The scary thing about AI is that we don’t really know what it is capable of yet. Research into more powerful tools like quantum computing is still in its infancy but is set to change the field dramatically.