
The past 18 months have been full of unexpected technological advances, as the whole world shifted to meet the demands of a new reality. But are these innovations just a product of the pandemic or here to stay? Here are five technology trends that experts believe won’t disappear with lockdowns.

Back in 2018, Erik Ekudden, Ericsson Senior Vice President and Group CTO, listed ‘the realization of zero touch’ as his number one future tech trend to watch. Little did we know just how true his words would prove to be – though not quite in the way he may have thought. The global pandemic forced people and businesses alike to rapidly adapt to a new reality – one that overwhelmed health systems and, perhaps most importantly, isolated us from normal life.

But over the past year we’ve also seen technological innovation step up to fill these voids.

Advances in fields such as artificial intelligence, e-commerce and the Internet of Things were already well established on the tech trends radar. What we didn’t expect was that fields such as education and healthcare, among the most conservative in the adoption of new technology, would suddenly take center stage – and progress in a matter of months in ways that would usually take years. Of course, we’ve understood the importance of digital connectivity for decades. But we never predicted it would become the center point for our everyday lives almost overnight.

Today as vaccines roll out and many of us eagerly await the return of hugs and gossip over the office watercooler, I wonder – which of these new technologies will stand the test of time? Here are five of the top technology trends from 2021 that experts believe are likely to stick around for years to come.

Trend 1 – Digital workplaces
By the end of June 2020, 42 percent of the United States labor force was working from home full-time. While we scrambled to find the best ways to work remotely, collaboration software boomed. In 2020, the global video conferencing market reached USD 7.87 billion – more than double the previous year’s figure.


In general, employees have responded positively to the convenience of ‘WFH life’, but employers are also noticing the benefits – lower office rental and upkeep costs, for example. According to our Future of Enterprises report, 60 percent of decision makers are very satisfied with the ability to cut down on office space, with 43 percent strongly believing they will have no office at all by 2030. Early indications also show remote workers are up to 40 percent more productive than their in-office counterparts.

According to the global survey featured in our IndustryLab report exploring the dematerialized office and the 2030 future workplace, half of respondents said they would want a full-sense virtual presence at work from anywhere. Imagine digital workspaces where you can wave to a colleague across the room, hand over an important document or even share coffee and cake (complete with tempting digital aromas and tastes) without leaving home – or from a favorite getaway location.

With tech giants including Twitter and Facebook announcing plans for more permanent working-from-home arrangements post-COVID, it’s generally agreed that the future of work is remote, and that ‘business as usual’ will never be as it once was.

Trend 2 – Online learning
Digital workspaces and dematerialization won’t just benefit those in the workforce. At the peak of the COVID pandemic, over 1.6 billion children in 195 countries around the globe were sent home as classrooms closed.

As well as video conferencing tools, other digital services such as language learning apps, virtual tutoring and e-learning software have all seen huge surges in demand. At the same time, initiatives like Keep America Running have shown just how quickly our society can connect – both digitally and empathetically – for a common cause, like giving more students without an internet connection access to remote learning and narrowing the educational divide.

With quality education key to both the United Nations Sustainable Development Goals and the Human Development Index (HDI), there’s no question that education must be well-resourced and accessible to all.

According to the OECD, 95 percent of students in Switzerland, Norway and Austria have a computer to use for their schoolwork, compared to only 34 percent in Indonesia. And in the US, virtually all 15-year-olds from a privileged background said they had a computer to work on, while nearly a quarter of those from disadvantaged backgrounds did not.

As we continue the important work to improve educational opportunities through technology, we need to ensure we’re reducing, and not contributing to, inequality in education.

While the extent to which e-learning continues once students return to their classrooms is yet to be seen, the necessity of connectivity for education has been made abundantly clear. And as 5G networks enable faster internet and more reliable connectivity than ever before – even in remote locations – these possibilities will only continue to grow.

Trend 3 – Telehealth
The healthcare industry has traditionally been one of the most resistant when it comes to IT and digital technology uptake. However, the COVID-19 pandemic showed the huge potential, and real-world functionality, of telehealth technologies as vital tools to help avoid the spread of viruses through tracking, testing and treating.

In a research innovation project launched in September 2020, Ericsson, Telia and Sahlgrenska University Hospital in Sweden used AI to help monitor and manage the demand on healthcare resources, creating and refining advanced AI analysis and insight models for the planning and prediction of healthcare resources.

Ericsson, University Hospital Birmingham NHS Foundation Trust (UHB) and King’s College London also collaborated on the 5G Connected Ambulance – a groundbreaking new way to connect patients, ambulance workers and remote medical experts in real time. This innovation enabled healthcare workers to perform the UK’s first remote diagnostic procedure over 5G, demonstrating its transformative potential to enable clinicians and paramedics to collaborate haptically, even when they are miles apart – and help patients even if they can’t get access to a hospital.

Telehealth also offered other game-changing ways to address the challenges of providing health services at home, through video conferencing, email, telephone and smartphone apps.

These advances have been particularly helpful for seniors. Recent insights from an Ericsson ConsumerLab study revealed that devices and the internet had helped 90 percent of seniors surveyed during the pandemic. The benefits offered by technology aren’t limited to medical services either, but can be factors that can improve overall quality of life through mobility, safety and socialization.

A 2020 study also concluded that the COVID-19 pandemic had forced important changes in the healthcare industry which may help to establish telehealth more firmly in the years to come. This will be a vital step in building trust and technological literacy for the revolutionary innovations set to transform the future of medicine.

Trend 4 – Contactless convenience
Contactless technology is defining the customer experience post-COVID, from touch-free payments and ‘just walk out’ shopping to biometric check-in for travel and accommodation.

Even when shopping in-store, almost 90 percent of shoppers in the US now claim to prefer touchless or self-checkout features. And with security always a high priority in an increasingly globalized world, facial recognition security systems are becoming more and more common.

These safe and undeniably convenient innovations have been made possible with more advanced processors and memory chips, better image sensors, smarter AI and faster communications networks, all of which will continue to improve in the coming years.

Combine this with the expectation that virtual and augmented reality will fundamentally change our everyday lives in areas including education, work, social interaction, travel and retail, and we’ll start to see a true blend of the physical and digital – no touching required.

Trend 5 – AI-generated content
Machine learning innovations like Generative Adversarial Networks (GANs) have caught a lot of attention in recent years, often thanks to shocking – and sometimes hilarious – celebrity deepfakes. These generative models pit two neural networks against each other: a generator ‘learns’ patterns from training data to produce new data resembling the original input, while a discriminator tries to tell the real data from the generated. One example of the output is digital images that look like photos of real people, as seen on the rather unsettling website This Person Does Not Exist.

Now you can stare into the eyes of a stranger who doesn’t exist. These are artificially generated images of human faces via thispersondoesnotexist.com.
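To make the adversarial idea concrete, here is a deliberately tiny sketch – not any real GAN implementation, and far simpler than the networks behind deepfakes. A one-line ‘generator’ tries to mimic a Gaussian ‘dataset’, while a logistic ‘discriminator’ tries to tell real samples from generated ones; each is updated to beat the other. All parameters and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: real data ~ N(4, 1). The generator maps noise z ~ N(0, 1)
# through G(z) = a*z + b; the discriminator D(x) = sigmoid(w*x + c)
# outputs the probability that x came from the real data.

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

a, b = 1.0, 0.0            # generator parameters
w, c = 0.1, 0.0            # discriminator parameters
lr, batch = 0.05, 64

for step in range(2000):
    # Discriminator step: ascend on log D(real) + log(1 - D(fake)).
    real = rng.normal(4.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: ascend on log D(fake), i.e. try to fool the discriminator.
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

# The generated distribution is N(b, a^2), so b should drift toward the real mean.
print(f"generated mean after training: {b:.2f} (real mean is 4.0)")
```

In real GANs both players are deep networks and the data is images or text rather than numbers, but the back-and-forth is the same: the generator only ever improves by exploiting what the discriminator still gets wrong.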

This technology, mostly used for entertainment and filter apps, is learning as it goes, and is being fed more and more of our history, stories and personal information. We already know that 50 percent of us are uncomfortable with not being able to tell the difference between human and machine, a concern raised back in 2018 in our 10 Hot Consumer Trends (see trend six on uncanny communication). But where will it end?

One possibility is that artificial intelligence may become more and more powerful over time due to the changing nature of our media consumption.

But how can AI match the human touch when it comes to creativity? Well, the short paragraph above was written by an AI text generator. Did you notice a difference?

Whose line is it anyway?
The Associated Press has been using AI to report on Minor League Baseball for years, and last year the Guardian went one step further, publishing an op-ed article authored entirely (and rather entertainingly) by OpenAI’s language generator GPT-3. Language generators are now so good that even researchers are struggling to tell the difference, admitting that even the famous Turing Test is no longer sufficient to tell man from machine.

“[As] more language models are made… it’s going to be harder to figure out if a machine generated an article,” said Adaku Uchendu, doctoral student at the Penn State College of Information Sciences and Technology where the research was conducted. “So, we have to improve our detection models even further.”

But what will it mean to be accused of being an AI? If we’re all using AI to author our text in the future, is there any point in making distinctions? Just look at how predictive text on our devices already suggests what to write next. If the future author is a form of human-machine symbiosis, it’ll be very hard to differentiate that from pure AI-generated text.

“The ultimate goal of this work would be to enforce some kind of disclaimer on an article that states that it is machine generated,” Uchendu continues. “If people are aware, then they can do more fact checking when an article is machine generated.”

Ericsson Principal Researcher, Rebecka Cedering Ångström, introduces another angle, raising the deeper questions and considerations we’ll need to make in order to live harmoniously with AI technology in the future.

“What makes us think machine writing needs greater fact checking? Humans are just as biased – they can be wrong and often write with an underlying agenda. We should focus less on who is writing, and more on who is sending the message. Or take it a step further – who will we allow to make the greatest mistakes? Are we more likely to forgive a falsely written article by a machine or human? Or what about a missed diagnosis from an AI or a human doctor? These are important questions we, as a society, need to consider.”

The debate will no doubt rage on over whether the future of content is human or machine, but as the technology improves, particularly for video and image generation, we’re likely to see more AI-generated content out there – whether we’re aware of it or not.

Disclaimer: This blog post was not written by a machine (except where indicated) – human imperfection is solely to blame for any flaws or errors.
