

36 IEEE SOFTWARE | PUBLISHED BY THE IEEE COMPUTER SOCIETY 0740-7459/16/$33.00 © 2016 IEEE

FOCUS: PERSPECTIVES—US

Four Thought Leaders on Where the Industry Is Headed

Andrew Moore

ON THE BASIS of what’s going on at large Internet companies—plus what I see my star faculty working on—it’s clear that many of the new capabilities you’ll be seeing in software systems in 10 or 20 years have to do with the ability to adapt to information that’s streaming in all the time. In the future, software will much less frequently perform clear transactions that always return the same answer; instead, software processing will be about piecing together many disparate bits of information. Some of this is already happening in the ranking systems that companies such as Facebook and Google use, which govern how they present you personalized results. These systems are already ingesting tens of thousands of pieces of information every second, 24/7, day after day.

These systems provide all kinds of challenges for software engineers. How do you test and provide quality assurance on systems that are adapting all the time and will be running very differently over time on the basis of accumulated data? Both industry and academia are developing such methods, but this field is only in its infancy. If you’re a software engineer, it’s a crime for you not to be an expert in probability. You’ll be dealing with fuzzy information in many places, and you have to be very comfortable with how to combine it. If you get to take only one expert-level class, don’t take a computer science class; take a statistics class.
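The kind of evidence combination Moore has in mind is, at its simplest, Bayes’ rule applied repeatedly. A minimal sketch in log-odds form—the signals and their likelihood ratios below are invented for illustration, not drawn from any real ranking system:

```python
# Combine independent pieces of noisy evidence about a binary hypothesis
# using Bayes' rule in log-odds form. The likelihood ratios below are
# invented for illustration.
import math

def combine_evidence(prior, likelihood_ratios):
    """Return posterior P(H) after folding in independent evidence.

    prior             -- P(H) before seeing any evidence
    likelihood_ratios -- P(signal | H) / P(signal | not H), one per signal
    """
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    odds = math.exp(log_odds)
    return odds / (1 + odds)

# Three individually weak signals combine into a strong belief:
posterior = combine_evidence(0.01, [8.0, 5.0, 12.0])
print(round(posterior, 3))
```

The point of the log-odds form is exactly the “comfort with combining” Moore describes: each new fuzzy signal becomes a simple addition, and the order in which evidence arrives doesn’t matter.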

Similar complexities also apply to the verification of physical systems as they work in the real world, another emerging discipline about which I’m passionate. Here at Carnegie Mellon University, Andre Platzer is applying the formal methods that have proven useful for verifying system behavior to systems at scale such as those for air traffic control and self-driving cars. One way to visualize where this research might be headed: when two ultra-intelligent cars know they’ll collide, and they have a few seconds left before the collision, the negotiation between them must be about how they save as many lives as possible. Verifying that these complex systems can reach optimal answers regarding life-or-death decisions is what’s needed—not so much an “Asimov’s 3 Laws of Robotics approach” based on general principles for minimizing harm. Another of our CMU faculty, Aaron Steinfeld, is a pioneer in self-driving cars and has turned his attention to policy issues related to crashes.

Over the past several years, almost every domain has experienced increasing dependence on software to deliver new, innovative capabilities. Here, four software engineering thought leaders offer challenging, thought-provoking ideas about where our industry is headed. They discuss challenges for software engineers to keep their skill set current and tout the promise of new technologies to advance our field. Their diverse perspectives illustrate diverse ways of envisioning some Next Big Things for software and software engineering. —Guest Editors

JANUARY/FEBRUARY 2016 | IEEE SOFTWARE 37

Formal methods will be a key enabling technology. Over the years, many of our hearts have repeatedly been broken by the potential for formal verification methods and how they haven’t been able to reach the larger population of software engineers. Formal methods will continue to be important and will be used effectively as the underpinning of these new capabilities, although not every software developer will be using them.

Tim O’Reilly

CODE FOR AMERICA is changing government and changing how we think about our role as software professionals. Its work is deeply rooted in the notion that you can no longer govern without using digital technology. Technology is central to how we deliver services today and how people access them.

The lessons we’ve learned from software development can be applied to make government work better for everyone. Right now, government still operates in ways used early in software development, when methods to manage software projects were more similar to those used to build a bridge. In describing the healthcare.gov site’s failure, Internet pundit Clay Shirky remarked that the waterfall method, in which everything is spelled out in excruciating detail beforehand, is essentially a promise “not to learn anything while doing the actual work.”1

Today’s software progresses through countless small updates, learning as we go, and building iteratively to create products that work for their intended users. Code for America brings this iterative, data-driven, user-centered approach to building government software and programs. One place we’ve done this is with CalFresh, California’s version of the Supplemental Nutrition Assistance Program (what used to be called Food Stamps). The team discovered that there were not only software problems but also many problems with how the program interacted with its users. The team simplified the application process and ended up building a set of apps that could “debug” the CalFresh system, discovering where people were confused or discouraged, and understanding why the systems used to deliver the program were interfering with its ability to achieve its objectives.

Bringing programmers into government through programs such as Code for America (which works with cities and counties) or, at the federal level, the US Digital Service and the General Services Administration’s 18F unit, is teaching government to work in this user-centered, data-driven way. By putting the user rather than the government itself at the center of service design, we hope that Internet technologies will let us rebuild the kind of participatory government our nation’s founders envisioned.

Something else we can teach government: in the private technology sector, we’ve learned to build fundamental platforms and reuse them. We don’t build monolithic applications over and over again from scratch. We need to start building government software as a set of APIs and platforms. Government can take many practical steps right now to move in this direction, which I describe in my essay “Government as a Platform.”2 In the future, we can expand these ideas to figure out how government can become a more open platform that lets people inside and outside the government innovate and that lets outcomes evolve through interactions between the government and its citizens.

ANDREW MOORE is the dean of Carnegie Mellon University’s School of Computer Science. He previously was the vice president for engineering at Google, where he established an engineering office in Pittsburgh. There, Moore was responsible for the retail segment: Google Shopping. Google Pittsburgh is responsible for machine-learning and distributed-systems components of many of Google’s advertising and e-commerce systems. Moore’s areas of research and deployment include AI, Bayesian networks, customer churn analysis, counterterrorism analytics, data mining, machine learning, medical informatics, national retail data analysis, computational physics, process control and optimization, and experimental design, to name a few. He has taught graduate and undergraduate classes in AI, machine learning, and computational statistics and algorithms. His data-mining tutorials have been downloaded more than a million times.

TIM O’REILLY has a history of convening conversations that reshape the industry. If you’ve heard the terms “open source software,” “Web 2.0,” “the maker movement,” “government as a platform,” or “the WTF economy,” you’ve heard about something he’s had a hand in framing. He’s the founder, CEO, and chairman of O’Reilly Media and a partner at the early-stage venture firm O’Reilly AlphaTech Ventures. He’s also on the boards of Maker Media (which was spun out from O’Reilly Media in 2012), Code for America, PeerJ, Civis Analytics, and POPVOX. He talks here about Code for America and its mission of making government work for the people and by the people in the 21st century.

Another lesson from technology that we need to explore is how to build modern regulatory systems. You can think of Google search quality, banks’ credit card fraud detection, antispam systems, or the reputation systems used by many modern applications as a kind of regulation. But these regulatory systems work very differently from government regulatory systems, which basically specify a set of rules. Regulatory systems should focus on outcomes and adjust the rules (just as a fraud detection or search quality system continually adjusts to detect new kinds of attacks) to help achieve those outcomes. Too often, rule-based systems fail, and no one notices. We need to start asking about government regulations: did they achieve their outcomes? If not, rethink or tweak them. Don’t just consider them a success because they got passed into law!
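The “adjust the rules to achieve the outcome” loop O’Reilly describes can be sketched as a simple feedback controller. The target rate, step size, and observed fraud rates below are all invented for illustration; a real system would of course tune its rules far more richly than a single threshold:

```python
# Outcome-focused rule adjustment: instead of a fixed rule, nudge a
# decision threshold so the measured outcome tracks a target.
# Target rate, step size, and observed rates are invented for illustration.

def adjust_threshold(threshold, observed_rate, target_rate, step=0.05):
    """Tighten the rule when too much slips through; relax it when the
    rule blocks more than the target calls for."""
    if observed_rate > target_rate:
        return threshold + step             # tighten
    if observed_rate < target_rate:
        return max(0.0, threshold - step)   # relax, but never below zero
    return threshold

threshold = 0.5
for observed in [0.09, 0.07, 0.04, 0.02]:   # measured fraud rate per period
    threshold = adjust_threshold(threshold, observed, target_rate=0.05)
print(round(threshold, 2))
```

The contrast with a statute is the loop itself: the rule is re-evaluated every period against the measured outcome, so a rule that stops working gets noticed.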

There’s a real opportunity to bring the skills we’ve learned as developers, user experience designers, and data scientists into the public sector. Government is a huge part of our economy; if we don’t bring it into the 21st century, it will hold all of us back. Fortunately, there are now many initiatives, such as Code for America, through which software industry professionals can make a real contribution.

References
1. C. Shirky, “Healthcare.gov and the Gulf between Planning and Reality,” blog, 2013; www.shirky.com/weblog/2013/11/healthcare-gov-and-the-gulf-between-planning-and-reality.
2. T. O’Reilly, “Government as a Platform,” Open Government, E. Lathrop and L. Ruma, eds., 2010; http://chimera.labs.oreilly.com/books/1234000000774/ch02.html#lesson_2_build_a_simple_system_and_let_i.

Paul D. Nielsen

IN THE NEAR TERM, we need to blend the best features of the agile and disciplined approaches to software engineering. Both have contributed—but each might fail in the extreme. We’ve seen some attempts to balance these approaches, but I don’t believe we’ve quite got it yet.

We need to develop capabilities quickly—but we also need to deliver those capabilities with quality assurance and fewer vulnerabilities. Agile lets us get working software in front of users as quickly as possible and get valuable feedback. But this might not always fit well with safety- or life-critical applications—for example, a self-driving car or an implanted medical device.

Software engineering has also become more about the composition of existing functionality while adding some innovative features and not as much about creating all the functionality from scratch. I believe that good software architecture will enable this. At one extreme, agile might not pay enough attention to software architecture. At the other extreme, the more disciplined approaches might pay too much attention to architecture at the expense of moving forward. How do we find that right amount of architecture to keep our velocity and quality high—and allow for continued system evolution without costly refactoring?

Other trends and advances are important. We’re building larger, more complex, and more connected systems. These systems are hard to test and verify through our traditional approaches. To address this challenge, we need to find new strategies to blend traditional testing, new advances in formal methods, modeling and simulation, automated testing, and continued data collection after fielding. Obviously, the approach and mix will differ—there will be a different verification-and-validation (V&V) strategy for a game application than for the nation’s air traffic control system.

Furthermore, besides systems that are just larger or more connected, we’re entering an era of systems with increasing autonomy. It’s not as useful to argue over whether a system is autonomous as it is to understand that we’re seeing a general growth in autonomous functionality. We’re delegating more decision making to our software. There will be a spectrum of autonomy in our systems, and how much autonomy we let our systems exercise might be dynamically allocated on the basis of the situation or environment.

PAUL D. NIELSEN is the director and CEO of Carnegie Mellon University’s Software Engineering Institute (SEI), a global leader in advancing software and cybersecurity to solve the nation’s toughest problems through focused research, development, and transition to the broad software engineering community. The SEI is a key innovator in areas central to US Department of Defense and civilian government operation in the cyberspace domain, including software architecture, software product lines, interoperability, the integration of software-intensive systems, network and system resilience, and the increasing overlap of software and systems engineering. The SEI also directly supports more than 50 US government entities in their efforts to efficiently and effectively acquire and sustain new software and systems.
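Dynamically allocating autonomy on the basis of the situation can be sketched as a simple policy function. The levels, inputs, and thresholds below are invented for illustration—a real system would derive them from certified safety cases, not a handful of if-statements:

```python
# Situation-dependent autonomy allocation: pick how much decision making
# to delegate to the software right now. Levels, inputs, and thresholds
# are invented for illustration.

def autonomy_level(sensor_confidence, environment_known, operator_available):
    """Return the autonomy level the system may exercise in this situation."""
    if sensor_confidence > 0.9 and environment_known:
        return "full"          # software decides and acts on its own
    if sensor_confidence > 0.6 and operator_available:
        return "supervised"    # software proposes, a human approves
    return "manual"            # a human decides

print(autonomy_level(0.95, True, False))
print(autonomy_level(0.70, False, True))
```

Even in this toy form, the point of the spectrum is visible: autonomy is not a fixed property of the system but an output recomputed as conditions change.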

Autonomous-system development, architecture, and V&V will further stress our current software engineering practices. For example, as we develop systems that can learn in the field, how will we determine when they’re sufficiently ready for release and won’t go awry?

And yet, the advent of more-autonomous systems might also greatly help software engineers by providing new tools to address our current challenges. Development environments with more-autonomous features might help us quickly create high-quality, secure code with fewer vulnerabilities.

We might be able to gather field data in volume and at high speed to improve code functionality and security much more efficiently. These tools might let software engineers concentrate more on high-end innovation and creativity—the part of our profession we most love.

I believe it’s an especially exciting time for software engineering and software engineers. The future is ours for the making!

Kevin Fall

AN IMPORTANT TREND in software engineering has been the desire to have more reusable code. Software development is more an exercise in composability than authorship. What we’ve seen recently as a result is the creation of more and more frameworks that let developers provide functionality more easily from reusable pieces—but generally with little validation and with an uncertain or unknown supply chain.

We also have technologies that can provide important software capabilities and help assure software quality. Examples include formal methods, system analysis, and machine learning for autonomy and control. But these technologies are difficult for the average software engineer to fully exploit.

The challenge is to get these technologies to work together effectively so that we can build bigger systems with confidence. How can we validate that systems built with these technologies will perform as intended? These problems will likely be addressed in the next five to 10 years.

Twenty years out, I see many challenges arising from how we’re going to effectively build software that works on advanced hardware. We already have quantum key distribution. Twenty years from now, if our software needs to work with quantum computers, for example, how will we reason about those software systems’ performance? How will you write software in a probabilistic system that will give you a range of answers to a calculation, not always a single “correct” one?
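One way to reason about a computation that returns a range of answers is to treat each run as a sample and make claims about the distribution rather than about one value. The noisy estimator below is a classical stand-in invented for illustration, not a model of any real quantum hardware:

```python
# Reasoning about a probabilistic computation: repeat it, then make
# statistical claims about the spread of answers instead of asserting
# a single "correct" result. The noisy estimator is invented for
# illustration.
import random
import statistics

def noisy_sqrt(x, rng):
    """A stand-in for hardware that answers only approximately."""
    return x ** 0.5 + rng.gauss(0, 0.01)

rng = random.Random(42)   # fixed seed so the analysis is repeatable
samples = [noisy_sqrt(2.0, rng) for _ in range(1000)]

mean = statistics.mean(samples)
spread = statistics.stdev(samples)
# The verification claim becomes probabilistic: "the answer lies within a
# few standard deviations of sqrt(2) almost always," not "equals sqrt(2)."
print(round(mean, 3), round(spread, 3))
```

Testing such a system means asserting bounds on the mean and spread over many runs—exactly the shift in mindset Fall is pointing at.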

I think future software engineers will be able to use these technologies as an expanded set of tools in their toolbox. A developer would be able to pick these things up and maybe not understand them all in detail but have confidence that they’ll work. As I mentioned, software engineering is an exercise in composition, and software engineers will need to be able to import an artifact from who knows where and understand important attributes about it. The component itself might make a claim about its properties, but how does a developer know to trust it? (This dovetails nicely with security properties. People who work in malware are essentially trying to do the same analysis for a certain class of attributes on components.)
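A minimal version of not taking a component’s claims on faith is to pin a cryptographic digest of the vetted artifact and check it before use. The artifact bytes and workflow below are invented for illustration; real supply-chain verification layers signatures and provenance on top of this:

```python
# Verify an imported artifact against a pinned digest rather than
# trusting whatever the component claims about itself.
# The artifact bytes and the pinning workflow are invented for illustration.
import hashlib

def verify_artifact(artifact_bytes, pinned_sha256):
    """Return True only if the artifact hashes to the pinned value."""
    digest = hashlib.sha256(artifact_bytes).hexdigest()
    return digest == pinned_sha256

artifact = b"component v1.2 payload"
pin = hashlib.sha256(artifact).hexdigest()  # recorded when the artifact was vetted

print(verify_artifact(artifact, pin))                 # prints True
print(verify_artifact(artifact + b" tampered", pin))  # prints False
```

A digest only proves the artifact is the one that was vetted; the harder question Fall raises—whether the vetted artifact actually has the properties it claims—still needs the analysis techniques he describes.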

So will we have a meaningful software field 20 years from now (versus a collection of specializations that don’t share much)? Yes. The ability to compose radically different software components will be a common theme. The trick is whether mere mortals can actually do the programming, as the set of needed competencies continues to grow (and the competencies of the future might require disciplines that haven’t been invented yet). Maybe automata will do the programming for us, and the question will be whether we can specify the end result well enough.

KEVIN FALL is the deputy director and chief technology officer of the SEI, where he directs the R&D portfolio of the SEI’s technical programs in cybersecurity and software engineering. He’s also an adjunct professor at Carnegie Mellon University’s School of Computer Science.

Selected CS articles and columns are also available for free at http://ComputingNow.computer.org.