Lead story – AI adoption and ethics – and end of year gut check
Views on AI adoption are all over the map. I recently heard backchannel talk about how AI isn’t being adopted by lines of business. As Derek writes, McKinsey’s data says different: McKinsey – AI adoption is leveling off, but AI leaders are pulling ahead. However, the data is a mixed bag:
Since the first survey was done in 2017, McKinsey has seen AI adoption increase from 20 percent of respondents to approximately 50 percent. However, adoption peaked in 2019 at 58 percent and has since leveled off.
We can debate the definition of AI till the cows come home. I suspect many of the top use cases wouldn’t reach my “intelligent” threshold for AI, but, what are those top use cases? Derek:
Of the capabilities used, some 39% said that they're using RPA in at least one business process or product, the highest performing. This is closely followed by computer vision (38%), natural language text understanding (33%), virtual agents (33%), deep learning (30%) and knowledge graphs (25%). The capabilities with the lowest penetration at the moment are transfer learning (16%), generative adversarial networks (11%), and transformers (11%).
Where is the value coming from? Derek quotes McKinsey:
Today, the biggest reported revenue effects are found in marketing and sales, product and service development, and strategy and corporate finance, and respondents report the highest cost benefits from AI in supply chain management.
The bottom-line value realized from AI remains strong and largely consistent. About a quarter of respondents report this year that at least 5 percent of their organizations’ EBIT was attributable to AI in 2021.
That matches my take. AI has a decent level of enterprise adoption, and some metrics that justify its value. This, dear reader, is why I’m so comparatively hostile to Metaverse, blockchain, and other areas where the hype machine is so far ahead of adoption and ROI. But AI has other problems: many of the use cases, operating at scale, are arguably controversial or even detrimental. Neil Raden has documented many of these problems via his diginomica articles.
In sum: our ethics (and our regulations) need to somehow catch up with our live AI use cases. Ask San Francisco about that: San Francisco killer police robots policy halted after backlash. Neil has tackled AI ethics frequently, as has Chris, who filed a new one this week: AI – where do ethics stop, and regulations begin? Chris shared a provocative quote from the Director of the UK’s Ada Lovelace Institute:
Our public research shows again and again that people would like to see more regulation, even if it comes at the cost of innovation and more products.
That brings things to a head, eh? I’d suggest that’s where the 2023 conversation on AI should begin.
Diginomica picks – my top stories on diginomica this week
Vendor analysis, diginomica style. Here are my three top choices from our vendor coverage:
A few more vendor picks, without the quotables:
Jon’s grab bag – Neil does some digital twins hype-busting in Digital twins have their place – but is healthcare one of them? The space economy may not be as futuristic as we think. That’s the message from both Chris (Why UK space tech is now ready for take-off) and Neil (The space economy matters – an enterprise view on space as the International Space Station heads towards de-orbit).
Finally, if you’d like a bit of satirical spleen vent after the tech predictions deluge, join Brian Sommer and me for this annual tradition: the un-predictions (The 2023 enterprise software un-predictions). There’s a little something for everyone here, but readers enjoyed the digital hug that got me into trouble, “quiet answering,” and “decomposable apps.”
Best of the enterprise web
My top seven
- Amazon, Google, Microsoft and Oracle will share the Pentagon’s $9 billion cloud contract – Why choose between hyperscalers when you can just choose them all? Even better: you won’t be taken to court about it.
- re:Invent or re:Position? AWS tries to ‘out Google’ Google on the importance of your data strategy – It took a week, but we finally got a good piece of re:Invent analysis, via a strongly-worded piece from HfS Research: “If you move your existing crap into the cloud, you’ll end up with even worse crap… and less than 50% of cloud native transformations are currently successful.”
- 3 Reasons You Might Select the Wrong Transformation Partner – 2022 is almost over, but Ted Rogers of UpperEdge puts his hat in the ring for best piece on services vendor evaluation this year.
- Four Strategies That Worked – After so much effective deconstruction of supply chain missteps, good to see Lora Cecere narrow down the most important keys to SCM effectiveness. Descriptive analytics and supplier development (underrated) stood out.
- New Ransom Payment Schemes Target Executives, Telemedicine – These clever schemes are worth studying – it would not be that hard to make one erroneous click and get caught up in one.
- DevOps and AIOps – what is their relevance in ERP shops? A post-SAP TechEd roundtable – I didn’t think SAP TechEd put enough emphasis on DevOps and AIOps, so I set about to correct that, via an old school podcast with John Appleby and Brenton O’Callahan of Avantra, and Martin Fischer of Neptune Software (Fischer is also the lead for DSAG’s DevOps interest group).
- AI’s Jurassic Park moment – Bad actors “are likely to use large language models as a new class of automatic weapons, in their war on truth.” Gary Marcus’ focus here is the urgent one, not whether ChatGPT can graduate from college, or write convincing college essays – though that’s clearly an issue for academia. Bonus points for roping in Jeff Goldblum in Jurassic Park.
- ChatGPT: AI is now a decent writer. So you need to be better – Without Bullshit’s Josh Bernoff says: “I think from this point forward, you cannot be certain whether what you read was created by a machine or a human.” Well, except for the bot-free column you are reading now. When robots figure out how to imitate this, I’m done for.
Yeah, all I need to hear next to me on a plane is some sales hunter pressing the issue with a prospect:
Even if phone calls are possible on airplanes, it’s still rude https://t.co/FUkHFilx8Q
“According to the Washington Post, European regulators and airlines are now investigating ways to enable 5G mobile phone calls on airplanes.”
-> think air travel is bad enough now, just wait
— Jon Reed (@jonerp) December 11, 2022
Oh, and this happened:
Facebook Asks Lawmakers Not to Regulate Crypto Too Harshly Just Because of All the Fraud https://t.co/zK7kfQ3gh7
“Facebook is a place for friends…”
and frauds. 🙂
monetizing the metaverse is hard enough without having to clamp down on scams and grifts…
— Jon Reed (@jonerp) December 11, 2022
One of my least interesting tweets of the week went kinda viral:
I hope you’re sitting down for this one, but I just got an exceptionally bold tech prediction for 2023:
“Organizations will continue to struggle with data in 2023”
-> it takes real guts to go out on such limbs 🙂
— Jon Reed (@jonerp) December 8, 2022
I guess data remains a conundrum, and peeps don’t care for the tech predictions barrage… But as for AI taking our content jobs, I’ll leave you with my upbeat thoughts on that:
– a compelling storyline/narrative (fiction or non, AI sucks at that)
– satire (AI can only guess at that)
– subjectivity/honesty, inserting your own experience into your theoretical constructs.
In other words, show some guts and some craft, and own your bias https://t.co/W5Q9tDiQDX
— Jon Reed (@jonerp) December 11, 2022
This may be the last hits/misses of the year – unless I do a year-end edition. Thanks for the readership and spicy comments – see you on the other side. If you find an #ensw piece that qualifies for hits and misses – in a good or bad way – let me know in the comments as Clive (almost) always does. Most Enterprise hits and misses articles are selected from my curated @jonerpnewsfeed.