IOCI understands that developing or improving leadership skills, refining your career goals and growing into a servant leader take time and ongoing training. That’s why we offer learning academies and other leadership development resources through our Learning Center.
This is an excerpt from IOCI’s white paper, Leading in the Era of AI: What Organizational Leaders Must Know to Thrive. To read more, you can download the full copy from our IOCI Learning Center.
Three factors explain the recent rapid growth of artificial intelligence systems, according to MIT researchers Erik Brynjolfsson and Andrew McAfee. First, an explosion of digital data, such as digital photos and data from sensors, provides far more fodder for AI systems. Second, computer scientists have made major strides in improving artificial-intelligence techniques, relying in part on those troves of data. Third, improvements in computer hardware, in particular the development of the graphics processing unit, have enabled systems to crunch larger amounts of data faster than before (Brynjolfsson & McAfee, 2017).
“Today, machine learning algorithms are actually as good as or better than humans at many things that we think of as being uniquely human capabilities,” says Jeremy Howard, a research scientist at the University of San Francisco. Here’s one example: “People whose job is to take boxes of legal documents and figure out which ones are discoverable–that job is rapidly disappearing because computers are much faster and better than people at it” (Kirkland, 2014).
Even so, the growth in the amount of available digital data swamps the capacity of conventional computers to crunch it. That’s where quantum computing comes in. Classical computers store and process information as binary bits, the strings of zeros and ones sometimes used to depict what’s inside a computer. Quantum computers, by contrast, exploit quantum-mechanical phenomena to encode information in quantum bits, or qubits, an approach that allows them to tackle certain computations in a fraction of the time a classical machine would need. Although the technology is in its infancy, quantum computers theoretically would be able to take on problems that are impractical for even the beefiest conventional computer. Google, for example, predicts that by the end of 2017 it will produce a quantum computer able to outperform any conventional computer, and others predict that quantum computers will host machine learning systems not long down the road (Nott, 2017; Simonite, 2017).
The third computer technology that is revolutionizing business is 3-D printing. A conventional printer forms a two-dimensional image, a photo or letters, for example, by depositing ink on a flat sheet of paper. A 3-D printer instead deposits material bit by bit in three-dimensional space, building up physical objects layer by layer. An oddity just a few years ago, 3-D printers have become commonplace and are already making inroads in business, with companies using them to manufacture items as varied as automotive parts and pharmaceutical pills.
One of the underlying concepts for 3-D printing was patented in 1980, and engineers have since made rapid advances in turning the technology into reality. In recent decades, much manufacturing has shifted to Asia because of low labor costs there, but 3-D printing is expected to upend these global economic relationships. Inexpensive 3-D printing will make it economical to manufacture goods much closer to the people who buy them, even factoring in higher labor costs in the West. “The great transfer of wealth and jobs to the East over the past two decades may have seemed a decisive tipping point. But this new technology will change again how the world leans,” predicts Richard D’Aveni (2013), a professor of strategy at Dartmouth College’s Tuck School of Business.
As a result, manufacturing processes will change. “Whereas today a lot of manual effort and time is spent on planning, managing and executing non-productive tasks such as scheduling, changeovers, maintenance and loading of parts, in the future, large-scale production environments may manage themselves via digital models, simulations, robots, sensors, condition-based systems and artificial intelligence” (Lakner & Van den Bossche, 2014, p. 1). But 3-D manufacturing also poses significant challenges for employees, who will likely need to upgrade their skills to work in a 3-D printing environment.
Marialane Schultz, CEO of IOCI, has studied the impacts of artificial intelligence (AI) on business. When asked why these insights are important, Marialane shares, “Now more than ever, there is a growing strategic imperative for senior leaders to begin assessing how work will change, what capabilities will be needed and how organizations must prepare to lead an AI-augmented workforce. AI advances are accelerating and those who prepare are likely to remain relevant and perhaps even enjoy a competitive advantage over those who don’t.”
This is the time to be asking the tough questions: What skills will be prized, who will need to be trained, and who will miss out if we do not pivot and address these changes now?
To learn more about how to thrive in an AI-augmented business world, read our white paper today, and contact us if you need further coaching to navigate these changing times.