> I know that you didn’t ask for this job. You didn’t ask for this role in society. None of you, not one of you, wants to think about the many people that can be affected by one fucking perfectly normal bug or mistake in the technology that you built. And this is one of the reasons we keep our heads down. No one became a geek because they wanted to be the center of political attention.
> That just happened.
> You don’t get to choose. You don’t get to choose what era of history you live in and what that era wants to do with you. And this is a moment when it’s all up for grabs. That’s what it means to say we’re on a burning planet. And what it means to say that we don’t have neutral ground is that you’re at the center of that fire. You set it. You’re one of the people that set it. You’re one of the people that tend it. And everything you do, the changes you make over the next months and years are going to chime down decades and centuries and shape the lives of people you will never know [...]
The solutions to social problems are not going to come from nerds sitting in ivory towers, who so far have been happy to take credit for unintended positive consequences and to look the other way or get defensive when negative consequences show up.
This is the wrong class of people to be guilt-tripping. It's as dumb as trying to guilt-trip someone on Wall Street. Not that guilt-tripping works on any class of people; it just produces short-term fixes under pressure, with costs down the line.
Techies need to be guided by social scientists, psychologists, anthropologists, and politicians local to different cultures. These are the people on the front lines dealing with the consequences. That is the only route to a better place.
As necessary as what you suggest may be, it won’t come without costs. It’s likely to slow technological progress down to a crawl as every step is painstakingly debated as well as render innovation more vulnerable to political and economic interests — anybody who decides they have an axe to grind about any particular technology or surrounding field can just construct a wall of false ethical questions to debate and keep it perpetually bogged down.
I’m not arguing against having more thought put into what we do, but taking it too far could easily put us into a collective rut that is nearly impossible to get out of.
If these people are so important, why aren't they making the millions and the decisions? The money and the power are in the technologists and management. We need a route to your proposed route. Some way to convince large organizations of software developers that ethics is, somehow, more important than money.
> If these people are so important, why aren't they making the millions and the decisions? The money and the power are in the technologists and management.
You literally answered yourself. The key is that "important" != "earns money or has power".
"Apolitical" spaces are a tool. Specifically, it's a polite fiction that lets people drop their overt political agendas in favor of getting shit done, working together, enjoying a hobby, remaining on good terms with people, eating dinner with family, or whatever other highly valuable thing people get together for.
It's not "politics doesn't exist". It's that the advantages of declaring a political cease-fire in many contexts outweigh the cost of not being able to advance your personal agenda. This is especially true in many engineering and technology contexts, where creating a product that generates business value for your company is orders of magnitude more important than your political concerns. Even if you think your politics is extraordinarily important, the answer is likely to donate a portion of your salary to political groups rather than politicizing the workplace.
That depends on what you mean by "politics" - if you're being unfairly discriminated against on grounds of race, gender, etc., that's something that other people have already brought to the space. Your options are to accept it (at a permanent disadvantage) or complain (and risk being dismissed as "politics").
Sometimes the political is really personal.
("Apolitical" is like "leaderless" or "self-driving": there really is a leader or a driver or a politics, it's just that everyone agrees to pretend that they don't exist or their intermittent intervention isn't crucial)
Well yeah, your choices in general with an apolitical space are either to break the no-politics cease-fire or to decide that things are bad enough that the deal you're getting isn't worth staying quiet for. People tend to underestimate the value of the cease-fire, though - I wouldn't break it at my workplace for a $5k/yr raise; more than that might be worthwhile, though.
The people who run water systems, food distribution networks, defense systems and similar should be people who would reasonably accept that their responsibility to the system comes before [insert issue].
Every possible opinion is represented out there somewhere, and there will be someone who honestly believes, eg, that shutting down the electricity grid for 24 hours to make a point about Trump is justified. Or a point about immigration. Or a point about socialism, or what have you.
What is usually meant by 'politics' is that the people who are in charge of workspaces should be on the exact other end of the spectrum to these hypothetical activists. If anything even tangentially threatens the safe operation of the system, it should be a priority over political issues.
I think what you're not addressing though, is that often the work engineers are doing is political. No one can doubt that Facebook has fundamentally changed politics, no one can doubt that the way they set it up had spectacular effects on politics in this country - and that they're now deliberately modifying their system to have a specific political impact. Maybe if they'd thought about the politics of their system as part of their design stage, they would've thought about what damage they were going to cause. The choice not to has arguably been a massive blow to democracy.
Say a conservative becomes CEO of AT&T. Should they consider the "damage" that will be caused by routing calls for liberals, embrace their political responsibilities, and kick suspected left-wingers off the PSTN?
Republicans are real people who really make up approximately half the voters in this country. The "threat to democracy" here is the suggestion that it's a moral imperative to rig core communications infrastructure against them.
Facebook hasn't "fundamentally changed" anything. Right wingers just happened to be savvier at manipulating the tools it provides, this time around. If anything, that's a surprise. One would expect social media to be good at rallying younger, tech-savvier, more left-leaning voters. One would expect the DNC to run the savviest social-media campaign, based on its track record in '08 and '12.
I fundamentally disagree. Ethical consumption is a cop-out, especially for those who have means (most people on this site).
Real integrity comes from ethical production. Place your time and skills towards companies and projects that at the very least do not make the world worse. And if you can't do that, I do not think you get to get on a soap box and talk about how great you are because you shop for local produce, eat vegan, and drive an electric car.
Everyone is as evil as their most sociopathic executive: you both contribute to the same bottom line, but the employee sells out for less.
I'm not saying to choose what projects to work on or what companies to work for in an impact-on-society-neutral way. I'm saying that once you've decided that doing X is the best thing to do, having the politics be considered "settled" by everyone involved is a great way to stop wasting resources on politics in favor of getting the shared goal done.
I would think it quite premature to consider numerous courses of action in tech, particularly concerning privacy, surveillance, speech, and the allocation of benefits, settled.
This sounds much like someone preferring to shut down such debate.
By contrast, prolonging and deliberately promoting uncertainty, as in the cases of leaded paint & petrol, asbestos, tobacco, CFCs, and now CO2 emissions, shows the other face.
This is too limited a view of what is "political" because it completely ignores the context in which it lies. That's the point of the article - that you cannot disentangle human action from societal impact.
So extending your example: by doing work that generates "business value" in a corporation as an at-will employee, you're implicitly supporting the political structure of hierarchical capital ownership of productive labor. This would be in opposition to, say, working only in a labor-owned cooperative. (Note that this is not to favor one or the other, simply to make the point that there is a choice.)
People generally do not make such a distinction because it doesn't seem like there are options to choose from, or more likely they are simply ignorant of other ways of cooperation/organization.
Not everyone outside your political cease-fire zone will agree that the work you're doing is apolitical.
If you have an agreement (formal or tacit) not to talk about politics with your colleagues while the team manufactures Nazi flags... that may make for a pleasant and productive workplace, but broader society is still going to have a political reaction to that work.
The key is to understand the difference between contexts. Defining the context of your workplace is fine, but it's a mistake to assume that all contexts should treat you and your work the same.
This mistake is where you get comments like "but Facebook is such an inclusive company and everyone who works there are good people." That can be 100% true, but still not invalidate the criticisms of how Facebook's products are affecting society as a whole. And if someone at Facebook is going to listen to and consider that criticism, that will by necessity introduce the external context into the internal context.
You're suggesting people should be robots whose only purpose is serving their company and the capitalist system, and should refrain from showing or debating any personal issues they find important.
I think this approximates dictatorship and slavery. We shouldn't be scared or uncomfortable discussing politics; we should only try to remain calm and rational about it.
I think this concept is more accepted in Europe than in the US. There seem to be fewer taboo subjects in Europe.
It isn't that politics doesn't exist, it's that politics and engineering are two different things.
Nobody designs an infrastructure to change your desire such that you desire a red Popsicle and not a green Popsicle. They design a generic infrastructure that can influence your desires and the specific desires are fungible parameters. It's equally capable of making you want red over green or green over red, or want to conserve water, or stay home from the polls on election day, or drink more Ovaltine. The engineering portion is the same in each case. It's a general-purpose tool.
The politics is in what you use it for. But that's the same as anything. A wall or fence with the same engineering specs has very different consequences when it's used to corral livestock than when it's used to imprison people.
The issue is that some companies (e.g. Facebook) are not just doing the engineering, they're also making the political decisions. They're not just creating a general-purpose technology, they're also deploying it in a specific way. And making the decision for everyone because they don't have enough competition.
> It's a general-purpose tool. The politics is in what you use it for.
That's exactly the view they are arguing against. There is no hard border between tool and decision to use it, the infrastructure will always shift the context of what's possible and alter society.
Once nuclear weapons become possible, a cold war and arms race become imminent. Once a massive, uncensorable communication network becomes pervasive, the old social institutions that guide public communication become irrelevant, and we build new ones or face massive misinformation and propaganda campaigns. It's not that we choose to use social media to propagate fake news; it's the inevitable political consequence of their technical structure.
This is not an argument against building things. But an argument for taking responsibility for what you build and acknowledging that building technology is a political move, it strongly affects the inner life of the polis.
If a smith makes a hammer for nails, makes the blueprint available for everyone and someone uses that hammer model to kill someone, is the smith evil? Should the smith no longer make the blueprint available because hammers can be used for evil?
Has the smith made a political move by making the hammer available to everyone? Because all the smith wanted to do was share his knowledge with everyone, so everyone can smith their own hammers.
Ultimately the smith is not morally responsible for what people do with the hammer, IMO. The entire intent behind all of the smith's actions was to do good; can we call him evil for that?
Similarly, I don't think simply building tools is a political or moral move. Rather, it's what you encourage people to do with a tool, and what people actually do with it, that is political and moral. If I encourage my tool to be used for the betterment of society, am I evil if a small number of people abuse it?
Don't get me wrong, I still think SV is a den of abuse - but that is because social networks like Twitter and Facebook aren't the hammer. They're not neutral tools. Both profit from moral abuse of the platform; it is enabled and encouraged. Google as a search engine is building its tool with the sole intent of manipulating and abusing users for ads.
There are definitely tools that exist in a moral vacuum (just think of self-hosted radio-show software: it can host a show about pro-LGBT causes and one about 9/11 conspiracies - should the author be responsible for either?). But social networks and the tools you mention are largely not among them. Those tools have been made with the purpose of doing evil.
In this example, they hire smiths to do evil, and the smiths comply, making tools for evil. You can equate Google and friends to those smiths, since the outcome is the same.
Even airborne pathogens could be used for good: the HIV-based one could be re-engineered to transport an HIV cure while only showing mild symptoms of the common cold. Or to cure cancer.
Technology is not power. It enables power. If you engineer the technology to enable only a specific kind of power, then you the engineer and the technology you made can be called evil.
Otherwise I would like you to point me to the particle of malice in the hammer and the pathogen: the atom of injustice and evil that they're made of.
> Technology is not power. It enables power. If you engineer the technology to enable only a specific kind of power, then you the engineer and the technology you made can be called evil.
That's a clear contradiction. On one hand you acknowledge the choice of the technologist to be a power for evil; on the other you surmise that, as long as they are willfully ignorant of how the tech is used, they are merely a conduit of political power. It sounds like denial of an uncomfortable truth.
Not necessarily. I acknowledge that a technologist with the goal of doing evil can use technology to achieve that goal.
But a technologist with the goal to do good or neutral will be able to use the same technology to achieve the goal.
Technology itself is ignorant of how it is used; it's not human. The tools, the technology, remain ignorant of how they're used, even when it's for evil, because ultimately it's the human wielding the tool that does evil.
You may call technology only made for evil purposes evil if you want (I did say "can be called evil", not "must be called evil").
Or otherwise, did the technology choose to be evil? Was it asked, and did it consent, to be evil? Where is its atom of injustice in the tool? Or, for software, where is the bit that is evil?
If I make a thousand hammers and use one to murder, are all hammers evil?
I find it dangerous to bind evil through chains of causation: consider a doctor saving a life, and then the patient murders someone. Is the doctor evil? Are the tools of the doctor evil? Should doctors consider whether a patient might do evil?
The same process should be applied to tools as well. If a tool is used to save a life and then a life is taken, is the tool evil? What if it doesn't save a life beforehand - is it now evil? How many lives must it save before doing evil to be neutral?
That is why I consider technology neutral (even if you may call it evil, it remains neutral in nature, not in use). Once you allow technology to become moral and political, you open the can of worms of asking whether anything that enables evil, in any possible form or shape, is itself evil.
> There is no hard border between tool and decision to use it, the infrastructure will always shift the context of what's possible and alter society.
Technology always changes the landscape, but that isn't the thing anybody ever complains about. Nuclear weapons are only terrible when they're actually used against a city, not when they cause a cold war instead of a kinetic one. Which is the political decision.
When you build a system to affect mass desires and use it to get people to conserve water during a drought, nobody calls you a monster who is destroying society. You'll find serious people arguing that it's irresponsible not to do that.
People don't object to the tools, they object to the uses. Especially when somebody else uses them in a way that disadvantages the allegedly aggrieved party's political goals.
> Once a massive, uncensorable communication network becomes pervasive, the old social institutions that guide public communication become irrelevant and we build new ones or face massive misinformation and propaganda campaigns. It's not that we choose to use social media to propagate fake news, it's the inevitable political consequence of their technical structure.
Fake news isn't a controversy because it's a new thing. The "old social institutions that guide public communication" are the things that brought us three centuries of racist propaganda, gave birth to "yellow journalism", celebrated every desecration of the justice system in the Drug War, swallowed the party line on the War in Iraq, and fanned outrage for the sake of ratings so hard that the current President of the United States is Donald Trump.
The reason fake news is now a big controversy is that now there are a hundred cell phone videos of the event in question, ordinary people have easier access to primary sources and the barrier is much lower for someone who objects to the prevailing narrative to find an audience. So when the latest line of bull makes the rounds, it's more often followed by an angry mob decrying its falsity to anyone who will listen and backing their objections with evidence. And then it seems like there are a lot more lies, not because there are actually more lies but because there are all the same lies and many more people pointing them out.
The people to watch out for aren't the people who create this kind of technology. It's the people who think only they should be in control of how people can use it. Which includes both Facebook and the people who think Facebook should be regulated instead of smashed into a million tiny pieces.
> Once a massive, uncensorable communication network becomes pervasive
> the old social institutions that guide public communication become irrelevant and we build new ones or face massive misinformation and propaganda campaigns
You're basically saying state-run propaganda is the one you want... which is "fine", but still propaganda and misinformation, just of a different nature (apparently you agree with that one).
Like a communist who calls genetics a "bourgeois science", or a libertarian who calls global warming a "government conspiracy": the point is to apply an ideology to an uncomfortable reality, and get labels that terminate any further thought.
I'm basically saying there is no "fake news": it's all "fake", or whitewashed with the politics of the presenter.
Claiming that a technical innovation led to the political problem ignores the fact that it's a political problem, caused by a political problem (that everyone is biased by politics).
I think it's naive to think technical innovations cause the political issues. Even in the cold war example, it's politics that caused it; the technology just raised the stakes. Politics created the issue, not the engineering.
I certainly did not say that official state propaganda is better than online propaganda by other political actors (including foreign states). It's that modern society has developed institutions to deal with the innate tendency of political actors to control the public discourse. The free press, democracy and public accountability of leaders evolved in a different technological context, and the existence of those institutions was sometimes defended with blood.
When you disrupt the economic basis of the free press, you create a fundamental vulnerability, at least in the short run until those institutions adapt, like an organism exposed to a new mutation of a pathogen. And it might well take blood again to recalibrate. As a technologist, you don't get to ignore the political effects of what you create; you can't say "well, people have been killing each other on this planet for millions of years, I've merely created a device to automate killing a whole country, a neutral technology." It's not neutral. It has immediate political effects; it gives power to those who have it, to the detriment of the rest.
There is no debate that technology can change the world for the better; you just can't presume that any technology, introduced at any moment in time, to any audience, will be by definition positive. And once you start to nuance that position of absolute amorality for technology, you are engaging in politics. Technology is always power, and power is always political.
As the joke goes: "No self-respecting engineer would build a system for defeating emission tests, they'd build a generic system for defeating any test and create an instance specialized for emissions tests"
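The joke describes what is, in effect, a strategy pattern: a generic mechanism whose specific test is just a fungible parameter. A hypothetical sketch (all names and sensor fields here are invented for illustration, not from any real system):

```python
# Hypothetical sketch of the joke: a generic "defeat any test" system,
# with the specific test supplied as a pluggable predicate.

class GenericTestDefeater:
    """Switches behavior whenever its test-detection predicate fires."""

    def __init__(self, is_test_running):
        self.is_test_running = is_test_running  # fungible parameter

    def operating_mode(self, sensors):
        return "clean" if self.is_test_running(sensors) else "normal"


# The emissions case is then just one specialization of the generic system.
def looks_like_dyno_run(sensors):
    # e.g. wheels spinning while the steering wheel never moves
    return sensors["wheel_speed"] > 0 and sensors["steering_angle"] == 0


emissions_defeater = GenericTestDefeater(looks_like_dyno_run)

print(emissions_defeater.operating_mode(
    {"wheel_speed": 50, "steering_angle": 0}))   # test conditions -> "clean"
print(emissions_defeater.operating_mode(
    {"wheel_speed": 50, "steering_angle": 12}))  # real driving -> "normal"
```

The punchline is the point of the surrounding thread: the engineering is fully generic; only the instantiation carries the intent.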
I think that's fair. Personally, for a long time I didn't want politics to exist. I wasn't good at it, thought it was a waste of time and resources, and hoped to ignore it. But that clearly didn't work. E.g., the innocent BBS nerd havens of my youth turned into an online communication system that has let terrible people organize around the world.
IMO the prior you was correct. We don't even think about all of the good these tools have done, because we take it so for granted. The fact that these tools were built without reference to or consideration of politics is one of the greatest things ever to happen in human history. The fact that the internet is open, free, and (somewhat) decentralized would never have happened had politics been allowed to worm its way into the design process, and that would have been a tragedy of unspeakable proportions.
That's a bold and sweeping assertion for which you offer no proof whatsoever.
I think it's also incorrect in a few ways. One, the beginnings of the Internet were in government and academia. The former is definitionally political, and any academic will tell you how political the latter is. The notion that technology is somehow beyond politics is a very political idea, as is the centering of individual freedom as primary. Your refusal to acknowledge those politics is also a political position.
I personally agree with a lot of the politics embedded in the design of the Internet. But we shouldn't pretend like the politics doesn't exist.
> That's a bold and sweeping assertion for which you offer no proof whatsoever.
It's self-evident and you aren't providing evidence, you're arguing definitions. The internet is politically neutral. It is designed to carry data, without respect for its contents, point of origin, or destination. All participants in the network are treated equally, all forms of communication are treated equally.
Now, you can say "neutrality is a political statement". And if you want to define it that way, then sure, it's political. But I think that's a fairly silly, meaningless definition.
If it's evident to you but not evident to me, then it's self-evidently not self-evident.
The internet is not politically neutral. Carrying data without respect for contents, origin, or destination is a very political ideal. It's also an ideal that the internet has never met. Origin and destination have to have money to pay to connect, for example. And ISPs from the earliest days had acceptable use policies, so it was never "anything goes"; determining what was acceptable was very much a political process.
I note that the Bitcoin community, who are even stronger on the "without respect for contents, origin, or destination" philosophy, are even more obviously political. In pretty much any Bitcoin discussion here you can find people who espouse extreme anarchocapitalist values. Many are quite clear that any state interference in their business is anathema. Again, highly political.
What you're espousing is the sort of faux neutrality that the article is about.
This boils down to you defining neutrality as political. There is nothing faux about the neutrality of the internet. BGP peering is neutral and free. ISPs charge a fee for access, sure, but that's a choice made at the endpoints, on an individual basis.
What you are arguing is that neutrality is a political choice, and what I am arguing is that that is a bad definition of politics.
No, it's you defining a political viewpoint as neutral.
There are many possible notions of what "neutral" means. That you pick a specific, socially determined definition and treat it as the only possible one is a political action.
Erm, that's what I said? We're arguing definitions, not facts. You are defining neutrality as a political position. I am defining it as non-political.
My point is that your definition is bad, because it is a non-definition. If neutrality is political, then 'political' is a word without a meaning, because it applies to any and all possible things.
Sorry, I'm not sure if I'm understanding correctly. I have two equiprobable interpretations of your question:
1. The internet is not completely decentralized due to the politics of its engineers.
2. The internet being designed to be open and free was, as much as any other architecture, reflective of the politics of its framers.
I suppose they are essentially the same question, though. You're right, the architecture of the internet does reflect the politics of its engineers, to a degree. But I think it mostly represents an attempt at the most flexible, simple architecture to provide infrastructure for other applications without being opinionated about those applications. Which, on the one hand, is sort of a political choice, though I feel it's only political in the sense that it avoids any and all political considerations. Its politics are neutrality and agnosticism.
Not directly. You can call "open internet" politics, sure.
But this is different than the existing politics of taxes, gun rights, abortion rights, property rights, race relations, immigration, transportation, homosexuality, tariffs, or federalization.
You might claim it reflected free speech, i.e. the inability of government to restrict the content of written or spoken messages. But I think that would be a weak argument at best, especially for the pre-https web.
It seems to me that these kinds of authors always see the alternative as a Silicon Valley with their politics. This article describes a bunch of concrete things that tech companies ought to do: ensure the poor aren't marginalized, bring local communities together, and don't provide mechanisms for creating echo chambers. But what neither Turner nor Khan discuss is how tech companies will realize they ought to do this.
What if Amazon politicizes in the direction of libertarianism, and decides its only social responsibility is to increase the world's GDP?
What if Facebook politicizes in the direction of social justice, and determines that segregating people into identity-based safe spaces is the way to go?
What if Google politicizes in the direction of some political party, and decides it's duty-bound to tweak its search algorithm to hurt opposing candidates?
Yeah, I think there's a strand of political thought which assumes that no one can honestly come to a different political conclusion, and it's everywhere. It even gets applied to stuff like TV programs and other works of fiction. If you don't politicize everything according to their politics, it's because you're just too clueless to understand that everything has to be political.
I think that is basically what has happened, only one step removed from the corporations listed. For example, Cambridge Analytica leveraged Facebook to further their political agenda. The US government leveraged all of the above to set up a surveillance state unrivalled in its scope (bar maybe that of the Chinese). The web was supposed to be a liberator, allowing the marginalised to connect and flourish. This happened. Now nerds are cool, but no one ever considered that racist, misogynist scum are also a marginalised group.
Personally I see the alternative as a Silicon Valley which accepts that its output can never be "neutral" or "apolitical", no matter how many claims are made to the contrary, and which gives consideration to that before building things, allowing that consideration to influence the design.
Any time you're building technology with (or with the intention of) mass reach, "if the person I consider to be the most evil in the world were to commandeer this, what's the absolute worst they could manage to do with it" is a useful question to ask. If some companies had asked this question far earlier in their lifecycles, we might see a very different world around us today.
In a worst case scenario, at least the people at those companies are somewhat intelligent. All other things being equal I would prefer an oligarchy of people who are intelligent over one composed mostly of idiots.
He's talking about the neutrality vs agency of technology, which is largely unrelated to most of the political articles on HN. For context, here is a good thread with a bunch of Ph.D. researchers linking to their favorite academic resources on this topic:
Anyway this is what Turner is talking about when he's talking about algorithmic bias and actor-network theory. It has nothing to do with random articles about Trump's immigration policies or affirmative action or whatever, which is most of what gets flagged on HN.
I dunno, I've seen a few HN posts about, for example, the Facebook news feed and its connections to the propagation of fake news. That's absolutely connected to the "neutrality" of the news feed (it's an algorithm! Algorithms can't be bad, they're just math!), but will be flagged all the same. It's a shame for many reasons, but not least that it's a fascinating topic.
There have been quite a few major discussions of that. I think people start to flag such a topic when it enters a later stage of repetitiveness. Call it the Snowden effect.
There's politics, and then there's politics. The topics that get flagged usually are ones for which it's near-impossible to have a productive discussion. The HN guidelines are spot-on with "If they'd cover it on TV news, it's probably off-topic." The important topics usually filter through here. If you want to indulge in shouting at each other over what outrageous thing some political figure did or did not say this week, there are plenty of other venues on the Internet which encourage just that.
I've seen, and very occasionally moderated, productive discussions on difficult topics. It's not easy, or pleasant. It requires firm rules for participation and a steel will on the part of the moderator. People will be (or will act) hurt. And in any sizeable discussion, many comments will be moderated out. Preemptive moderation (hold for publishing) may be necessary.
This isn't something HN are likely to entertain frequently, but I can think of a few topics for which I'd like to see the attempt made, perhaps every few months, on a scheduled basis.
I have no problem with this rule. This isn't SV's domain expertise.
Maybe if political/social science is included in engineering schools as Fred Turner suggests, the quality of discussion and outcomes produced could be more constructive over the long term.
But with the current generation or two or three, forget about it.
I think you have the causality backwards. If we never talk about it, the quality of discussion isn't going to get better. And as the article says, "The design of technology hides its political imperatives by presenting as neutral." Us not discussing politics here is part of that pretense that what we do is somehow neutral. The pretense that things that touch millions, maybe billions of people could possibly be apolitical.
The only choice in the current high-noise/low-signal environment is to pretend to be neutral or pretend to know what you are talking about. Both Obama and Trump fall in the latter category.
You and everyone else who believes talking is a route to solutions are delusional. Especially in the current environment of self-reinforcing echo chambers and us-vs-them narratives. There has not been a single major issue solved in the last twenty years of social media and 24/7 news "educating and informing" the public.
Will technocrats who are political have any success? That experiment is currently running in China, and the default American narrative is that it will fail. So either way you look at it, an SV entry into politics will fail.
What they can do is change the environment. Remove the mechanisms that increase "engagement" at the cost of dividing people. This is much more within their domain expertise than thinking they need to spend their time talking, or propping up more talkers.
http://opentranscripts.org/transcript/no-neutral-ground-burn...