Welcome back to This Week in Modern Software, or TWiMS, our weekly analysis of the need-to-know news, stories, and events of interest surrounding the software and analytics industries.
This week, our top story addresses the future of AI … and the human race.
TWiMS Top Story:
Inside the Artificial Intelligence Revolution: A Special Report, Pt. 1—Rolling Stone
What it’s about: If you’ve been looking for the latest indicator that machine learning and AI are about to burst out of the lab and into the mainstream, look no further: Rolling Stone, not exactly a must-read in the tech industry, is serving up a big two-part feature on “the artificial intelligence revolution.” Part one went online this week, and it’s well worth the read. Author Jeff Goodell dispenses with some of the entertainment and media tropes around AI—bow down before your robot overlords, and so on—in favor of what AI, like so much of modern software, is really all about: algorithms. “Algorithms are to the 21st Century what coal was to the 19th Century: the engine of our economy and the fuel of our modern lives,” Goodell writes. “In the world of AI, the Holy Grail is to discover the single algorithm that will allow machines to understand the world—the digital equivalent of the Standard Model that lets physicists explain the operations of the universe.” Of course, no one’s found that yet, and true AI isn’t actually here yet: “AIs are nowhere near as smart as a rat,” Facebook director of AI research Yann LeCun tells Goodell. But that may be changing faster than many people realize.
Why you should care: Part one of Goodell’s piece offers a thorough look at the current state of machine learning and AI research. It also explores the enormous questions and issues that AI raises—and that very much remain to be answered. Stephen Hawking, for one, has noted that AI doesn’t actually need to turn evil to wreak havoc. “The real risk with AI isn’t malice but competence. A superintelligent AI will be extremely good at accomplishing its goals, and if those goals aren’t aligned with ours, we’re in trouble,” Hawking said recently. Moreover, Hollywood’s vision of AI is dangerous for a different reason: “The problem with the hyperbole about killer robots is that it masks the real risks that we face from the rise of smart machines—job losses due to workers being replaced by robots, the escalation of autonomous weapons in warfare, and the simple fact that the more we depend on machines, the more we are at risk when something goes wrong, whether it’s from a technical glitch or a Chinese hacker,” Goodell writes. And don’t miss part two of Goodell’s piece, which covers “self-driving cars, war outsourced to robots, surgery by autonomous machines,” and more.
- This Week in Modern Software: The Dawn of the AI-Powered Web—New Relic Blog
What it’s about: Speaking at the RSA Conference 2016, U.S. Secretary of Defense Ashton Carter effectively weighed in on the encryption showdown between the FBI and Apple, according to multiple reports. “I’m not a believer in back doors or a single technical approach. I don’t think it’s realistic,” Carter said in an onstage discussion with Kleiner Perkins Caufield & Byers General Partner Ted Schlein, according to Newsweek’s report. While Carter stated the Pentagon is in favor of strong encryption, he doesn’t believe there’s a single solution for handling it in law enforcement scenarios. “We shouldn’t let one case drive the solution,” Carter said, according to VentureBeat’s recap. “We have to innovate our way to a sensible result.” In a sense, that was something of an overall security theme this week: RSA president Amit Yoran echoed his 2015 keynote by telling Fortune that security is about much more than the latest and greatest tools: “There are no silver bullets in security.”
Why you should care: Carter, like the president who appointed him, has made modern software and technology a significant priority during his tenure. He has also been developing deep connections with Silicon Valley to help modernize the Pentagon’s technology strategy. From a November WIRED profile of Carter: “He believes the hardware and software its engineers and entrepreneurs dream up have unlimited potential to help the military do its job. What’s more, he argues, Silicon Valley—and tech entrepreneurs more broadly—can teach Defense a lot about flexibility, speed, and new ways to work.” VentureBeat’s Blaise Zerega notes that Carter’s RSA comments on encryption “seemed to be an olive branch to the tech community amid the bitter acrimony arising from Apple’s standoff with the FBI.”
- U.S. Secretary of Defense on Apple Encryption: “I’m Not a Believer in Backdoors”—VentureBeat
- Alphabet’s Eric Schmidt Heading Pentagon’s New Innovation Board—ZDNet
- More Than 11 Million HTTPS Websites Imperiled by New Decryption Attack—Ars Technica
- Darktrace’s ‘Digital Antibodies’ Fight Unknown Cybersecurity Threats with Machine Learning—ZDNet
- U.S. Military Launches Cyber Attacks on ISIS in Mosul, and Announces It—Ars Technica
- Cybersecurity ‘Bling’ Won’t Save Us: RSA President—Fortune
- This Week in Modern Software: The Pentagon Comes to Silicon Valley—New Relic Blog
What it’s about: Can boring-by-design enterprise software—think ERP and other core applications that have traditionally not concerned themselves too much with modern UI/UX best practices—become “sexy” when actual design gets prioritized? That’s the pitch of firms like Swedish developer Bang a New Ground, whose Farewell software for supply chain managers aims to bring the design principles favored by the likes of Google or Instagram to a business function that has long wandered in a usability desert. Farewell creator Peter Nilsson takes ERP stalwarts like SAP to task for their lackluster user experiences. But even SAP has purchased a ticket on the design-first software train, winning software design awards way back in 2014 and promoting Sam Yen to Chief Design Officer. Fast Company writer John Pavlus asks a good question, though: “Is making enterprise software ‘sexy’ even a good thing?”
Why you should care: Sexy may be a poor choice of words in corporate environments, and consumerization might have its limits for the enterprise software that powers modern business. But, yeah, usability is a huge deal, and SAP—not exactly synonymous with UI/UX innovation—could be a bellwether of a shift in enterprise software development. Yen shares a compelling reason why SAP didn’t pay more attention to design and usability issues earlier in its history: Its customers weren’t asking for it. Now, they are. Yen tells Co.Design: “Effective user experience design is now table stakes in the enterprise—it’s not optional. No matter how strong your business case is, you can’t just count on employee adoption.”
- Which Generation Is Most Distracted by Their Phones?—Priceonomics
Life and Death in the App Store—The Verge
What it’s about: Over at The Verge, Casey Newton profiles the rise and fall of mobile app developer Pixite’s fortunes. The small firm, which makes photo editing and design apps like Tangent and Assembly, saw its App Store-generated revenues double from 2013 to 2014—to nearly $1 million—while racking up accolades from Apple for its apps. But in 2015, Pixite dive-bombed back to earth: It shed a third of its revenue, according to Newton’s piece, and the largely bootstrapped company was no longer bringing in enough money to meet its costs. Now, Pixite says it’s back to break-even thanks to the early success of its Pigment app, but the shop is still struggling to find a path to sustainability in an app environment oversaturated with choices and competitors. Pigment could suffer the same fate as not just other Pixite apps, but apps from developers at large: Downloads surge after launch and then crater, while existing users ease into apathy.
Why you should care: For an industry that will soon hit $100 billion in annual revenue, it sure is hard to make money in mobile apps these days. Venture-backed super-unicorns like Uber might make it look easy. Ditto the rare game developer that hits pay dirt with a title like Angry Birds. But for most mobile developers, especially those building consumer-facing apps, it’s now very tough sledding. Sure, Pixite’s story could end up as a business-school case study: As Newton writes, “Pixite was building apps that sat uncomfortably between professional design tools and novelty consumer apps. It failed to create a distinctive brand, instead giving each app a name unrelated to everything that came before.” But in spite of mobile’s ubiquity and the staggering stats from firms like App Annie, even the success stories—and Pixite would certainly have qualified a short time ago—can struggle. Redpoint Ventures VC Ryan Sarver tells Newton: “[The mobile market] seems to be aggregating to a smaller set of winners who are getting more of the rewards. It’s trending to only get harder and harder than it is right now.” Yikes!
- This Week in Modern Software: The State of Cloud—New Relic Blog
- This Week in Modern Software: The People Behind the Technology That Wins Elections—New Relic Blog
- Follow the Money: 4 Financial Lessons for Mobile App Developers—New Relic Blog
Welcome to the Post-Cloud Future—InfoWorld
What it’s about: In the heavily hyped battle between on-premises data centers and the cloud, the latter has clearly won, writes InfoWorld’s Galen Gruman. But the consequences of that victory are often misunderstood as a binary, all-or-nothing choice between complex, customized internal systems and simple, generic cloud environments. It doesn’t have to be that way: “Vanilla shouldn’t be the only flavor” in the post-cloud era, Gruman writes, pointing to Tribune Media’s IT makeover with a mix of cloud, cloud-like, and traditional technologies as the new normal. Gruman’s post-cloud world is defined by “the triumph of a few key notions whose deployment depends on the right combination of price, complexity, responsiveness, flexibility, and strategic advantage…. How you deploy is a tactic, not a goal.” The key is that “The cloud itself isn’t the point. It’s an architecture, a method, a pattern. What makes a good cloud good also makes any data center good.”
Why you should care: Gruman outlines five “key notions,” beginning with “design for change.” The rise of technologies like containers and microservices are good examples of this mentality. Another key notion: “Favor results over control.” Gruman thinks many IT leaders overvalue control, and the public cloud often involves ceding some control to third-party providers. He writes, “Control is overrated and has a large cost. If you get the results you need from someone else’s efforts at a lower cost or burden, you don’t need control.”
Also at InfoWorld this week, Cloud Technology Partners’ David Linthicum shares three critical areas enterprise cloud architects often overlook in their deployments: governance, management, and security. Security shouldn’t surprise you, and governance might be overlooked for an understandable reason: Linthicum likens it to the brakes on a car—you don’t need them until you’re already moving—which leads people to skip governance in the planning phase. The management piece is especially interesting: Just because you’re giving up some control in the public cloud doesn’t mean you’re washing your hands of responsibility. “This means performance management, SLA management, and logging,” Linthicum writes. “Though public clouds are managed by unseen minions, you are responsible for managing your own cloud instances.”
- 3 Critical Issues Cloud Builders Overlook—InfoWorld
- Can Oracle Become a Public Cloud Power?—ZDNet
- How IBM, Google, Microsoft, and Amazon Do Machine Learning in the Cloud—InfoWorld
- Salesforce’s Enterprise Cloud Problem—The Information (Paywall)
- Brokers, Not Owners: 5 Best Practices for Modern CIOs—New Relic Blog
- The Hybrid Cloud SLA Challenge: Before, During, and After—New Relic Blog
What it’s about: Just because you can code doesn’t mean you can teach. And vice versa. Which presents a dilemma for the federal government’s multi-billion-dollar effort, dubbed Computer Science for All, to add computer science and related skills to public school curricula in the United States. The vast majority of teachers don’t know how to code: So who, exactly, is going to do the teaching? Fast Company’s Jessica Hullinger tackles the possible answers, starting with: “One thing it doesn’t mean, or it really shouldn’t mean, is that we replace any existing teachers with engineers or computer science specialists.”
Why you should care: It may be wiser to teach teachers how to code rather than teach coders how to perform in a classroom. Adam Enbar, cofounder of the coding academy the Flatiron School, tells Hullinger: “Learning how to code is certainly not an easy task, but it pales in comparison to learning how to teach.” Still, bringing teachers up to snuff on the ins and outs of Java isn’t something you accomplish in a couple of days of training. Hullinger offers up four ways to teach our public school teachers to code. One example: “Teach the basics, and teach them often.” Teaching eighth graders the fundamentals of computer science doesn’t require the same skill set needed to, say, get hired by Google to work on machine learning and AI. “They don’t need to be experts; they need to know enough,” Enbar explains.
Want to suggest something that we should cover in the next edition of TWiMS? Email us at firstname.lastname@example.org.
Tune In to the Future
Can’t get enough modern software news and commentary? Be sure to check out our new Modern Software Podcast. New Relic Editor-in-Chief Fredric Paul and guests discuss the most important things happening in the world of software analytics, cloud computing, application monitoring, development methodologies, programming languages, and more. Listen to episode 8 or subscribe on iTunes.