This issue reminds me a lot of what we do in security and privacy, which are ostensibly engineering disciplines, but are really technology governance functions. Ethical AI is also a governance function, and like hackers in security, there is an invisible yet stark limit to the value activists can provide in actual governance roles. Litigating demands is just not how governance gets done.
The irony of Google hiring so many smart people is that they get a cluster of people who were institutionalized by academia, and now they have a CEO who has to act like an institutional dean.
Hackers hold your products to their standards, and ideally that adds value to your customers. Ethical AI people seem to do the same. The problem with both is that if you fail their purity tests, they will also sabotage your company because they're indexed on a greater good.
That the company is dealing with factions of what are essentially student activists issuing demands seems like an inflection point: any further growth is going to be a function of market domination, not their ability to align and make things.
"Focused, hard work is the real key to success. Keep your eyes on the goal, and just keep taking the next step towards completing it. If you aren't sure which way to do something, do it both ways and see which works better." --John Carmack
I would submit that Rob Mee (founder of Pivotal Labs, former CEO of Pivotal) is one of the greatest unsung heroes of Silicon Valley. So many of the biggest success stories in tech today were either advised by him or worked with Pivotal Labs in their early stages.
Both Twitter and Square are notable examples, but it is a very, very long list. Google also worked with Pivotal Labs in their early days.
Sorry, Christopher Alexander, who wrote "A Pattern Language" (and several other books), which was the original inspiration for design patterns in coding.
He also later wrote the foreword to Richard P. Gabriel's book "Patterns of Software", which contains this gem of a quote (I really recommend the book itself to anyone interested in design patterns):
...
In my life as an architect, I find that the single thing which inhibits young professionals, new students most severely, is their acceptance of standards that are too low. If I ask a student whether her design is as good as Chartres, she often smiles tolerantly at me as if to say, “Of course not, that isn’t what I am trying to do. . . . I could never do that.”
Then, I express my disagreement, and tell her: “That standard must be our standard. If you are going to be a builder, no other standard is worthwhile. That is what I expect of myself in my own buildings, and it is what I expect of my students.” Gradually, I show the students that they have a right to ask this of themselves, and must ask this of themselves. Once that level of standard is in their minds, they will be able to figure out, for themselves, how to do better, how to make something that is as profound as that.
Two things emanate from this changed standard. First, the work becomes more fun. It is deeper, it never gets tiresome or boring, because one can never really attain this standard. One’s work becomes a lifelong work, and one keeps trying and trying. So it becomes very fulfilling, to live in the light of a goal like this. But secondly, it does change what people are trying to do. It takes away from them the everyday, lower-level aspiration that is purely technical in nature, (and which we have come to accept) and replaces it with something deep, which will make a real difference to all of us that inhabit the earth.
I would like, in the spirit of Richard Gabriel’s searching questions, to ask the same of the software people who read this book. But at once I run into a problem. For a programmer, what is a comparable goal? What is the Chartres of programming? What task is at a high enough level to inspire people writing programs, to reach for the stars? Can you write a computer program on the same level as Fermat’s last theorem? Can you write a program which has the enabling power of Dr. Johnson’s dictionary? Can you write a program which has the productive power of Watt’s steam engine? Can you write a program which overcomes the gulf between the technical culture of our civilization, and which inserts itself into our human life as deeply as Eliot’s poems of the wasteland or Virginia Woolf’s "The Waves"?
At a former job I had to build a reporting system, and I had the trust to build it however I wanted. But it was my job to make it sufficiently useful that everyone would actually use it.
The smartest thing that I did was make it accessible from Excel. You could build a spreadsheet off of my report. Refresh the spreadsheet, the report ran, you got updated data.
I got essentially 100% adoption, and the rest of my job was spent finding people who needed data and adding it as an option to the reporting system.
Usually you expect a complex reporting system to have features like graphing, pivot tables, etc, etc, etc. My answer to all of that was, "You can already do that in Excel. I could spend a lot of time on it but I'm not going to do it as well as what you already have."
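For anyone curious what that looks like in practice, here is a minimal sketch of the pattern, assuming a Python/Flask service (the endpoint path, report registry, and sample report are all hypothetical illustrations, not the original system): the reporting service just serves each report as CSV at a stable URL, and Excel's web-query / "From Web" data source points at that URL, so hitting Refresh in the spreadsheet re-runs the report and pulls fresh data.

    # Minimal sketch: serve reports as CSV so Excel can refresh them
    # via a web query. Flask app, report names, and data are hypothetical.
    import csv
    import io

    from flask import Flask, Response, abort

    app = Flask(__name__)

    # Each report is just a function returning (header_row, data_rows).
    # Adding a new report for someone is one entry in this registry.
    REPORTS = {
        "sales_by_region": lambda: (
            ["region", "total"],
            [["east", 1200], ["west", 950]],
        ),
    }

    @app.route("/reports/<name>.csv")
    def report_csv(name):
        if name not in REPORTS:
            abort(404)
        header, rows = REPORTS[name]()
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(header)
        writer.writerows(rows)
        # Excel points a web query at this URL; Refresh re-runs the report.
        return Response(buf.getvalue(), mimetype="text/csv")

The appeal of a design like this is that all the presentation work (graphs, pivot tables, formatting) stays in Excel, where users already know how to do it, and the reporting side stays a thin data pipe.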