URGENCY: Which emerging technologies will evolve into necessities or new solutions for your business? (Photo: Istock)

Emerging innovations that could redefine IT

CIOs must keep an eye on the horizon. The following forward-thinking strategies and technologies are starting to gain traction and could impact the next decade of IT.

The pace of innovation is relentless. CIOs must watch for the next generation of emerging technologies because new software can go from the dreams of some clever coder to an essential part of every IT shop in the blink of an eye.

Once wild and seemingly impossible notions such as large language models, machine learning, and natural language processing have gone from the labs to the front lines. The next generation promises to deliver the same unstoppable parade of innovation.

Evolving technologies are starting to gather momentum today. Embracing an idea before it’s ready can be invigorating – if you’re right. Waiting until it’s established may be safer but can put you behind your competitors.

Analog computing

The most common paradigm for computation has been digital hardware built of transistors with two states: on and off. Now some AI architects are eyeing the long-forgotten model of analog computation, where values are expressed as voltages or currents. Instead of just two states, these signals can take on a nearly infinite range of values, or at least as many as the system can measure accurately.

The fascination with the idea comes from the observation that AI models don't need the same kind of precision as, say, bank ledgers. If some of the billions of parameters in a model drift by 1%, 10% or even more, the others will compensate and the model will often still be just as accurate overall.
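
A quick way to see this tolerance is to perturb a model's weights and measure how little the outputs move. The sketch below, a toy illustration using NumPy rather than any real analog hardware, jostles every weight of a small linear layer by about 5%:

```python
import numpy as np

rng = np.random.default_rng(42)

# A tiny stand-in for a trained model: one linear layer with 1,000 weights.
weights = rng.normal(size=1000)
inputs = rng.normal(size=(100, 1000))
clean = inputs @ weights  # the "exact" digital outputs

# Simulate analog drift: every weight wanders by roughly 5%.
noisy_weights = weights * (1 + 0.05 * rng.normal(size=weights.shape))
noisy = inputs @ noisy_weights

# The outputs typically move by only a few percent; intolerable for a
# bank ledger, yet often harmless for a model's final answer.
rel_error = np.abs(noisy - clean) / (np.abs(clean) + 1e-9)
print(f"median relative output drift: {np.median(rel_error):.2%}")
```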

The hope is that new analog chips will use dramatically less power, making them useful for mobile and distributed applications on machines that aren't always plugged in.

Main constituency: Any developer who will happily trade a bit of precision for big savings in electricity.

Chance of succeeding: Success or failure will probably be governed by the nature of the application: where absolute precision matters less than saving electricity, the chips are more likely to win.

Physical security of digital systems

When most IT people think of computer security, they think of clever hackers who infiltrate their systems through the internet. They worry about protecting the digital data that’s stored in databases, networks, or servers. The world of locking doors and protecting physical access is left to locksmiths, carpenters, and construction managers.

But physical security is becoming a real worry, and IT managers can't take it for granted. The best example of how physical and cybersecurity are merging may be the car thieves who've figured out that they can pry open a seam by the headlight, connect to the car's internal data bus, and inject the right messages to open the doors and start the engine. The Death Star wasn't the only technical marvel brought down by an unguarded physical port.

Similar attacks on the hardware found in desktops or laptops are starting to hit closer to home for IT departments. Building devices that are secure against both digital and physical assaults is very hard.

Main constituency: The most shocked are often companies that fall victim to an attack on poorly guarded hardware, but everyone needs to consider the dangers.

Chance of succeeding: Basic physical security is as easy as locking your doors. Real physical security may be impossible. IT departments must find the practical balance that works for their data and, at the very least, up their game to defeat the new generation of attackers.

Reliable computing

Trustworthy systems have always been the goal for developers, but lately some high-profile events are convincing IT managers that better architectures and practices are necessary. They know that many software developers fall into the trap of watching their code run perfectly on their desktop and assuming it will always be so in the real world. Spectacular software failures at companies like Southwest Airlines and EasyJet show how code that runs well most of the time can also fail catastrophically.

The challenge for IT teams is to anticipate these problems and build yet another layer of resiliency into their code. Some systems, such as databases, are designed to offer high reliability. Now developers need to take this to the next level with even stronger safeguards. Newer architectures such as microservices and serverless designs offer better protection but often come with troubles of their own.

Developers are re-evaluating their microservices and massive monoliths with an eye toward understanding how and when they collapse.
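
One small, concrete example of such a resiliency layer is wrapping an unreliable remote call in retries with exponential backoff. A minimal sketch in Python (flaky_service is a hypothetical stand-in for a failing dependency):

```python
import random
import time

def with_retries(fn, attempts=4, base_delay=0.5):
    """Call fn, retrying with exponential backoff plus jitter on failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries; let the caller's fallback logic run
            # Sleep 0.5s, 1s, 2s, ... with jitter to avoid thundering herds.
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))

def flaky_service():
    # Hypothetical stand-in for a remote call that fails half the time.
    if random.random() < 0.5:
        raise ConnectionError("service unavailable")
    return "ok"

try:
    print(with_retries(flaky_service))
except ConnectionError:
    print("still down after retries: alert, degrade gracefully, or fail over")
```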

Main constituency: Businesses like airlines that can’t live without their technology.

Chance of succeeding: Companies that improve their reliability and avoid even occasional mishaps and catastrophes will be the ones that live on. Those that don't will be bled dry by lost contracts and opportunities.

WebAssembly

Much of the past several decades of software development has been devoted to duplicating the ease and speed of native desktop code inside the security straitjacket of the modern web browser.

The results have generally been good, but they're about to get better thanks to the emergence of WebAssembly (Wasm). The technology opens up the opportunity for developers to write more complex code that offers more sophisticated and flexible interfaces to the user. Sophisticated tools like photo editors and more immersive environments become possible.

The technology also opens the door to more complex, compute-heavy code, with more sophisticated AI models and more responsive applications. CheerpJ, Wasmer, and Cobweb are just three examples of tools bringing languages like Java, Python, and COBOL to what was once the exclusive land of JavaScript.
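
To get a feel for the technology outside the browser, here is a minimal sketch that assumes the wasmtime Python package is installed; it compiles a few lines of WebAssembly text and calls the exported function from Python:

```python
# pip install wasmtime   (the wasmtime package is assumed here)
from wasmtime import Store, Module, Instance

store = Store()

# A minimal WebAssembly module, written in its text format: one exported
# function that adds two 32-bit integers.
module = Module(store.engine, """
  (module
    (func (export "add") (param i32 i32) (result i32)
      local.get 0
      local.get 1
      i32.add))
""")

instance = Instance(store, module, [])
add = instance.exports(store)["add"]
print(add(store, 2, 40))  # prints 42
```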

Main constituency: Teams that must deliver complex, reactive code to distant users. If much of the work is done on the client machine, WebAssembly can often speed up that layer. Managers who want to ensure that all hardware runs the same code will find the simplicity attractive.

Chance of succeeding: The foundation is here. The deeper problem is building out the compilers and distribution mechanisms that put running code on people's machines. The biggest challenge may be that simply downloading and installing native executable code is not that hard for many users, leaving less room for WebAssembly's main advantage.

Decentralized identity

The idea of splitting up our so-called identity is evolving on two levels. On one, privacy advocates are building clever algorithms that reveal just enough information to pass a given identity check while keeping everything else about a person secret.
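
The flavor of the privacy-preserving approach can be seen in a toy sketch: commit to each identity attribute with its own salted hash, publish only the hashes, and later reveal just the one attribute a verifier needs. Real systems use zero-knowledge proofs; this stripped-down Python version only illustrates the reveal-just-enough idea:

```python
import hashlib
import secrets

# The holder commits to each identity attribute with its own random salt.
attributes = {"name": "Alice Example", "age_over_18": "true", "country": "NO"}
salts = {key: secrets.token_hex(16) for key in attributes}
commitments = {
    key: hashlib.sha256((salts[key] + value).encode()).hexdigest()
    for key, value in attributes.items()
}

# The verifier holds only the commitments. To pass an age check, the holder
# reveals a single attribute plus its salt and nothing else.
key, value, salt = "age_over_18", attributes["age_over_18"], salts["age_over_18"]

assert hashlib.sha256((salt + value).encode()).hexdigest() == commitments[key]
print("age check passed; name and country stay secret")
```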

Another version seems to be evolving in reverse, as the advertising industry looks for ways to stitch together all our various pseudonyms and semi-anonymous browsing on the web. If you shop for umbrellas at a catalog store and then start seeing ads for umbrellas on news sites, you can see how this is unfolding. Even if you don't log in, even if you delete your cookies, these clever teams are finding ways to track you everywhere.

Main constituency: Enterprises like medical care or banking that handle personal information and attract crime.

Chance of succeeding: The basic algorithms work well; the challenge is social resistance.

GPUs

Graphics processing units were first developed to speed up rendering complex visual scenes, but lately developers have been discovering that the chips can also accelerate algorithms that have nothing to do with games or 3D worlds. Physicists have used GPUs for complex simulations for some time, and AI developers have deployed them to churn through training sets.

Now developers are starting to explore speeding up more common tasks, such as database searching, with GPUs. The chips shine when the same operation must be applied to vast amounts of data in parallel. When the problem is right, they can speed up jobs by 10 to 1,000 times. Not only that, but companies like Apple and AMD are doing a great job of integrating the GPU with the CPU to produce chips that handle both types of tasks well.
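
A rough way to feel that speedup, assuming an NVIDIA GPU and the CuPy package, is to run the same element-wise job on both chips; the exact numbers vary widely with hardware:

```python
import time

import numpy as np
import cupy as cp  # assumes an NVIDIA GPU with the CuPy package installed

n = 50_000_000
x_cpu = np.random.rand(n).astype(np.float32)
x_gpu = cp.asarray(x_cpu)  # copy the data onto the GPU once, up front

# Warm up so one-time kernel compilation doesn't pollute the timing.
cp.sqrt(x_gpu)
cp.cuda.Stream.null.synchronize()

start = time.perf_counter()
y_cpu = np.sqrt(x_cpu) * 2.5 + 1.0  # the same operation on every element
cpu_time = time.perf_counter() - start

start = time.perf_counter()
y_gpu = cp.sqrt(x_gpu) * 2.5 + 1.0
cp.cuda.Stream.null.synchronize()  # wait for the GPU to actually finish
gpu_time = time.perf_counter() - start

print(f"CPU: {cpu_time:.3f}s   GPU: {gpu_time:.3f}s")
```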

Main constituency: Data-driven enterprises that want to explore computation-heavy challenges such as AI or complex analytics.

Chance of succeeding: Smart programmers have been tapping GPUs for years for special projects. Now they’re unlocking the potential in projects that touch on problems faced by a larger array of businesses.

Green computing

Every day we hear new stories about huge new data centers filled with massive computers that are powering the cloud and unlocking the power of incredibly complicated algorithms and artificial intelligence applications. After the feeling of awe dissipates, two types of people cringe: the CFOs who must pay the electricity bill, and green advocates who worry about what this is doing to the environment. Both groups have one goal in common: reducing the amount of electricity used to create the magic.

It turns out that many algorithms have room for improvement, and this is driving the push for green computing. Does that machine learning algorithm really need to study a terabyte of historical data, or could it get the same results with several hundred gigabytes? Or maybe just ten, or five, or one? The new goal for algorithm designers is to generate the same awe with much less electricity, saving money and maybe even the planet.
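
The question is easy to test empirically: train the same model on shrinking slices of the data and watch where accuracy actually starts to suffer. A minimal sketch using scikit-learn's built-in digits dataset:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train on progressively smaller fractions of the data. Accuracy often
# barely moves while compute (and electricity) drops sharply.
for fraction in (1.0, 0.5, 0.25, 0.1):
    n = int(len(X_train) * fraction)
    model = LogisticRegression(max_iter=5000)
    model.fit(X_train[:n], y_train[:n])
    print(f"{fraction:>4.0%} of data -> accuracy {model.score(X_test, y_test):.3f}")
```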

Main constituency: Any entity that cares about the environment — or pays a utility bill.

Chance of succeeding: Programmers have been sheltered from the true cost of running their code by Moore’s Law. There’s plenty of room for better code that will save electricity.

Decentralized finance

Some call it a blockchain. Others prefer the more mundane phrase “distributed ledger.” Either way, the challenge is to create a shared version of the truth – even when the parties don’t get along. This “truth” evolves as everyone adds events or transactions to the shared distributed list. Cryptocurrencies, which rely heavily on such mathematically guaranteed lists to track who owns the various virtual coins, have made the idea famous, but there’s no reason decentralized approaches like this need to be limited to currency.
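
The core trick behind these ledgers is small: each entry carries a hash of the previous one, so nobody can quietly rewrite history without breaking every later link. A stripped-down sketch, ignoring the distributed consensus that real systems add on top:

```python
import hashlib
import json

def add_entry(ledger, transaction):
    """Append a transaction, chaining it to the hash of the previous entry."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    body = {"prev_hash": prev_hash, "transaction": transaction}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    ledger.append({**body, "hash": digest})

def verify(ledger):
    """Recompute every link; any tampering breaks the chain."""
    prev_hash = "0" * 64
    for entry in ledger:
        body = {"prev_hash": entry["prev_hash"], "transaction": entry["transaction"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev_hash"] != prev_hash or entry["hash"] != digest:
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
add_entry(ledger, {"from": "A", "to": "B", "asset": "policy-1234"})
add_entry(ledger, {"from": "B", "to": "C", "asset": "car-5678"})
print(verify(ledger))                       # True
ledger[0]["transaction"]["to"] = "Mallory"  # try to rewrite history
print(verify(ledger))                       # False
```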

Decentralized finance is one such possibility, and its potential stems in part from the fact that it involves several companies or groups that need to cooperate even though they don’t fully trust each other. The chain of transactions held in the distributed ledger might track insurance payments, car purchases, or any number of assets. As long as all parties accept the ledger as truth, the individual transactions can be guaranteed.

There also continues to be real interest in non-fungible tokens (NFTs), even though the hype has faded. These can have practical value for any business that wants to add a layer of authenticity to a digital experience.

Main constituency: Anybody who needs to both trust and verify their work with another company or entity. Enterprises working with digital elements that need more authenticity and, perhaps, artificial scarcity.

Chance of succeeding: It’s already here, but so far only in the cryptocurrency world. More conservative companies are slowly following.

************************************

Header text: Innovations

Images can be placed under each of the subheadings marked here.

Analog computing: Istock 1134867308. Caption: "COMPUTING: No longer just on or off"

WebAssembly: Istock 1235601598. Caption: "WEB ASSEMBLY: Opens up the opportunity for developers to write more complex code"

GPUs: Istock 1418482052. Caption: "GPU: Explore speeding up more common tasks such as database searching"