Leaked transcript of censored Bret Victor talk (alarmingdevelopment.org)
128 points by jashkenas on July 31, 2013 | hide | past | favorite | 76 comments



The comments on this Hacker News page are a bit odd. The linked article is clearly offered as a bit of playful satire. In the same way that you are not supposed to read "A Modest Proposal" and conclude that we should all eat babies, you are not supposed to read this and believe these are Bret Victor's thoughts.


OT, but the casual reference to an obscure 18th-century essay by Jonathan Swift might make this comment indecipherable without Google for most people.


As a counterpoint:

For my social circle it's not remotely in the obscure category. I suspect that none of my friends or coworkers would have to look it up on Google...

This is what a liberal arts education is for.

--

Of course I realize that not everybody is lucky enough (or even wants) to get a liberal arts education, but the social/educational structure of America is not something I'm looking to attack in this comment.


I thought this was part of a standard HS education? That is at least where I first encountered it.

I also have a liberal arts education, but those fruitful years were reserved for prevaricating on and dissecting various obscure and convoluted logical arguments related to phenomenology.


In the United States, most public education is administered at the town level. Because towns are so small compared to counties/states/nations, the variance among towns is very high -- two towns only 15 minutes apart can easily vary twentyfold along parameters like average income and the percentage of adults with graduate degrees.

Thus, what passes for a "standard high school education" in the United States varies wildly -- in some schools, you can graduate without even being able to read and comprehend a magazine article. In others, the top graduates often find Harvard less intellectually rigorous than their high school (because Harvard has to help their students from worse high schools catch up).


US education is inconsistent. At my school, we never had to read Swift, or many of the other books that seem to be considered canonical "high school" literature.


>I thought this was part of a standard HS education? That is at least where I first encountered it.

If you assume everybody on HN is an American, maybe.


> If you assume everybody on HN is an American, maybe.

I also attended a French school, where it was covered, and I've had at least one conversation with my Norwegian family in which it came up.


I got to read it in Swedish upper secondary school.


This is why even CS students should study humanities. "A Modest Proposal" is the opposite of obscure; it's one of the most famous pieces of satire ever written.


The essay itself may not be widely read, but the phrase is anything but obscure—it's closer to cliché.


I don't feel putting words in people's mouth is a good way to make a point, especially in a way that could be misleading at a casual glance. If you disagree with Bret, surely there is a better, more constructive way to explain your position.

(That is, of course, based on the overwhelming likelihood that these aren't actually Bret Victor's words.)


First, I'm not sure if the words the author is putting in Bret's mouth are supposed to be ironic (i.e., the author argues against that line of reasoning) or not. I'm going to take them at face value.

The post conveys the false idea that mathematical purity/simplicity and ease of use sit at opposite ends of an imaginary spectrum. If anything, the optimum is somewhere at their intersection.

That is, the ideal tool should be mathematically simple and sound, while still being intuitive.

We should also admit that we do not know of a good, reliable method for writing good programs, and this fact is completely independent of the programming language used. Languages have different properties: some are beginner-friendly, others scale well to large systems, some both. But in all cases, what's valuable is the thinking that leads to the program. In another commenter's words: formalizing murky ideas.


For any detractors of Excel and Access, I've news for you: "Software is eating up programmers"

Programming, as a fundamental activity, simply doesn't scale. Becoming competent takes many years of training, while software can "download" a braindump of best practices. For instance, Excel is used for a lot of business analysis. Business analysis, you say? How are 65,536 rows sufficient? Excel has an engine that can process large amounts of externally hosted data using columnar compression. A competent programmer with years of training may be able to build this themselves, but they will not be able to compete on price with a person who, armed with a half-hour tutorial, can perform the same task in Excel.
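To make "columnar compression" concrete, here is a toy sketch in Python: dictionary encoding plus run-length encoding of a single column. This is only an illustration of the idea; the real engines are far more sophisticated.

```python
def compress_column(values):
    """Toy columnar compression: dictionary-encode distinct values,
    then run-length-encode the resulting id stream."""
    ids = {}
    encoded = [ids.setdefault(v, len(ids)) for v in values]
    runs = []
    for code in encoded:
        if runs and runs[-1][0] == code:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([code, 1])    # start a new run
    dictionary = list(ids)            # id i is at index i (insertion order)
    return dictionary, runs

def decompress_column(dictionary, runs):
    """Inverse: expand runs and look each id back up in the dictionary."""
    return [dictionary[code] for code, n in runs for _ in range(n)]

col = ["East", "East", "East", "West", "West", "East"]
dictionary, runs = compress_column(col)
assert decompress_column(dictionary, runs) == col
```

Repetitive business data (regions, product codes, statuses) compresses extremely well under this scheme, which is why a spreadsheet engine can afford to hold far more than its visible row limit.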

Yes, there are monstrosities and errors with Excel: http://www.businessweek.com/articles/2013-04-18/faq-reinhart... The question is whether these errors are any easier to discover in a computer program.

The reason we aren't seeing more software like this is that programmer types don't think very hard about non-programmers' programming problems. I hope that with Jon's and Bret's work, we push the boundaries further and make computer programs easier to modify and easier to reason about (note: not just easier to write). In both Bret's and Jon's work, you don't avoid writing code. You just find it easier to figure out what effect a code change has.

I saw the comment about using a traffic cone as a mutex. Again, this is a symptom of the dismal state of the tools rather than of the idea behind the tools themselves. Visual SourceSafe still performs pessimistic check-outs. Imagine if the entire source code of a project were in one single file: better tools would evolve to find code and to allow many people to modify it at the same time. We already know how to lock a database and provide MVCC. One day, programming will be very similar.
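For contrast with pessimistic check-outs, here is a minimal sketch of optimistic concurrency, the version-check idea that MVCC databases build on: every record carries a version, and a write only succeeds if the version hasn't moved underneath you. Class and method names are illustrative, not from any real system.

```python
class ConflictError(Exception):
    pass

class VersionedStore:
    """Optimistic concurrency control: each write carries the version it
    read; a stale version means someone else committed first."""
    def __init__(self):
        self._data = {}  # key -> (version, value)

    def read(self, key):
        return self._data.get(key, (0, None))

    def write(self, key, expected_version, value):
        current_version, _ = self._data.get(key, (0, None))
        if current_version != expected_version:
            raise ConflictError(
                f"{key}: expected v{expected_version}, found v{current_version}")
        self._data[key] = (current_version + 1, value)

store = VersionedStore()
v, _ = store.read("rate")        # v == 0, nothing written yet
store.write("rate", v, 1.25)     # succeeds; record is now at v1
# A second writer still holding the stale version is rejected:
try:
    store.write("rate", v, 1.30)
except ConflictError:
    pass                         # retry after re-reading, instead of blocking
```

Nobody holds a lock while thinking; conflicts are detected at commit time and resolved by retrying, which is why many writers can work on the "same file" at once.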


"I could tell you more stories from the Golden Age: Spreadsheets, HyperCard, Delphi, Visual Basic. These were all enormous successes, allowing normal people to get shit done. They are despised by all true hackers. Because normal people can use them to get shit done. That is our greatest fear."

I'm actually a lot more frightened to be called for the cleanup...


> These were all enormous successes, allowing normal people to get shit done. They are despised by all true hackers. Because normal people can use them to get shit done. That is our greatest fear.

No; I have to object to this specifically. Normal people can't "get shit done." Not really, not successfully. People, when you let them build their own tools in Excel and Access and what-have-you, end up with rosters full of incorrect and incomplete and invalid information, things split into so many different places that nobody can find anything any more, and interfaces that require inexplicable cargo-cult rituals and avoiding otherwise valid input states to use. Their stuff works 95% of the time; they just aren't used to a world where "5% failure rate" means "silently and consistently eats any customer's file if their name has a ç in it" instead of just "has to be restarted every hour."
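The "ç" failure mode is easy to reproduce: decode bytes with the wrong codec and you get no exception, just silently wrong data. A minimal Python illustration, not taken from any particular tool:

```python
name = "François"
raw = name.encode("utf-8")       # how the file actually stores the name

# A naive tool assumes Latin-1 and "succeeds" -- no error, wrong data:
mangled = raw.decode("latin-1")
assert mangled == "FranÃ§ois"    # the ç has been silently corrupted

# Decoding with the codec the data was written in round-trips cleanly:
assert raw.decode("utf-8") == name
```

The point is that the failure is silent: a lookup keyed on the mangled name will simply never match, which is exactly the "consistently eats the customer's file" behavior described above.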

The thing programmers do--it isn't using arcane languages, recognizing mysterious error codes, memorizing APIs or libraries. We aren't here just because the difficulty of typing "/[A-Z]+?/" grants us job security. The thing programmers do--or more generally, the thing Engineers do--that other humans need us for, that machines can't do for us (yet), is to formalize murky ideas.

People who don't have training in Engineering have no fucking idea how to go about doing this. You know the old joke of there being a button labelled "Do What I Mean"? It's a joke because 99% of the people who would press that button don't know what they mean; there would be no coherent thing for the button to do--even if it could read their thoughts--but to interrogate them for hours to get them to decide what they really want.

The few of us--the Engineers--we can sit down, and without any further prodding, think out what we want to have happen. Then it's a simple matter of just writing it down. The compiler, the language, none of those are really problems, compared to knowing what you want to happen.

Note that Bret's talks and essays are focused, by-and-large, on the iterative rapid-prototyping model: using computers as tools to help us explore options, so that we can more quickly figure out "what we want." But even if you know that you want a Sudoku solver, you can't iterate out a Sudoku solver. You have to know that what you want to do is to put these constraints, in this order, on these numbers--and that's an algorithm. People--not us Engineers, but people--they don't understand algorithms. You need an Engineer to take the statement "I want a Sudoku solver" and formalize that into "I want to use this algorithm." Just like you need an Engineer to take the statement "I want this bridge-support anchored to the riverbed here" and translate that into materials and rigging that will take the tensions and stresses without shearing.
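To make the Sudoku point concrete, here is what that "put these constraints, in this order, on these numbers" step looks like once an Engineer has formalized it -- a minimal backtracking solver sketch in Python (0 marks an empty cell; illustrative, not production code):

```python
def valid(grid, r, c, d):
    """The three Sudoku constraints: digit d must not already appear
    in row r, column c, or the 3x3 box containing (r, c)."""
    if any(grid[r][j] == d for j in range(9)):
        return False
    if any(grid[i][c] == d for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != d for i in range(3) for j in range(3))

def solve(grid):
    """Solve a 9x9 grid in place by backtracking; returns True on success."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for d in range(1, 10):
                    if valid(grid, r, c, d):
                        grid[r][c] = d
                        if solve(grid):
                            return True
                        grid[r][c] = 0   # undo and try the next digit
                return False             # no digit fits: dead end, backtrack
    return True                          # no empty cells left: solved
```

Nothing here is hard to type; the work was deciding that "a Sudoku solver" means exactly these constraints applied in exactly this search order.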

Neither COBOL, nor SQL, nor HyperCard, nor AppleScript, nor Inform, nor any other "human-friendly" language, ever served to allow anyone to express a clearly-defined thought they wouldn't have been able to express in a language with more "punctuation-y" syntax. Learning what "<=" or "&&" represent is a small, fixed cost at the beginning of attempting to solve problems. Learning what they mean--what the formalism is there, and what its consequences are--takes years, and requires that you think like an Engineer. That step is required whether you are using C, COBOL, LabView or Prolog.

To get shit done, you must first know what done shit looks like; perfect, robust, unbreakably done shit. And only Engineers really do. Just like how people who aren't artists, if asked to draw someone's face, will end up drawing the abstraction of a face; people who aren't Engineers, if asked to formalize a system, will end up describing the vague, hard-AI-complete, "and then it just does what you'd expect okay!?" system that, in practice, isn't a system at all.


> The thing programmers do...

The thing that programmers do is the exact same thing that your post says "normal people" do. Programmers build buggy systems that lose data because someone's name has a ç in it, and that fail 5% of the time (or more). Rarely do they understand algorithms (and especially not their complexity characteristics). Very often they build the wrong thing, even when they get it done. Few have a formal education in anything resembling engineering. And very often programmers are indeed getting shit done; it just turns out that what they got done was just shit.

We have to stop putting programmers on a pedestal. There are great programmers and there are bad programmers. Just because someone slings some code doesn't make them a bastion of clear engineering practice. Many programmers that I've met would benefit greatly from the kinds of systems that Victor describes.


Let me first say, I agree with every individual statement you made, here.

However, let me put my argument another way: the ability to work with you to formalize your idea is what you're paying for when you hire a programmer. You're paying for a human compiler[1], a fully-intelligent REPL where you can give vague commands like "make me the next Facebook" and, through the programmer's Engineering knowledge, they will ask you questions and force you to make choices, until they've turned that informal idea into an actual capital-S System. Inasmuch as they have that Engineering knowledge, the resulting System will be a formalization of your own desires. And then, having the formal System, the programmer can go and implement it. That part is comparatively trivial, and increasingly done "by" the software compiler. Victor is noble for driving us further toward trivializing that part of the process, but it is only part of the process.

But anyway, back to your points:

Most "programmers" aren't programmers. If you're only as good at programming as a member of the general population, then you don't get a special job title, right? Otherwise, I would be right now a writer, actor, landscaper, game-designer, philosopher and life-coach, as well as a programmer. But in only one of those fields could I actually make more money than some schmoe who just decided to jump into the field last week, and the reason I'm "more of a programmer" than Bob-the-Accountant is a programmer, is that I can do the Engineering part better. If Bob-the-Accountant started calling himself a programmer, he'd be a white belt programmer--someone who just joined the Art, and must still unlearn their preconceived knowledge before they may begin--while my belt, at least, would have some color on it. Plenty of people calling themselves programmers are white-belts. That doesn't mean we should consider them when we speak of "the thing that programmers do." It would be like including people who write their own legal contracts when speaking about "the thing that lawyers do."

So yes, if we're going to keep using the word "programmers" to refer to both the Engineer journeymen and the white-belts who produce misfeasance with their every step, then we should stop putting "programmers" on a pedestal. After all, what use is a pedestal where the people on it are exactly as high up as the general population around them?

---

[1] https://news.ycombinator.com/item?id=6131742


> Most "programmers" aren't programmers.

So if you're paid to write programs you probably aren't a professional programmer? What's wrong with an operational definition?


Another way to say it: if we had any way to quantitatively measure the performance of programmers over time, most "programmers" wouldn't be programmers any more.

There's a field with a similar problem to our own: public-school teaching. Teacher performance isn't quantified, and so teachers can pretty much get away with only being as good at imparting knowledge as any random member of the population--even though they took years of education-in-Education. Most "teachers" aren't teachers, any more than I'm a teacher; they're simply people paid to repeatedly attempt (and fail) to be teachers.†

We pay a lot of people to repeatedly attempt (and fail) to be programmers.

---

† Teachers' unions are right now fighting the introduction of actual metrics on how the year-over-year velocity of a student's achievement is affected by a teacher, relative to the average of their peer teachers in the same school. They are fighting this because the data clearly shows a bimodal distribution, where a lot of people are just extremely unfit to be teachers--their students actually reach negative learning velocities because of their presence in the class--and until now, the unions protected these people, since there was nothing to prove they sucked so very much. It's hard to go back on your decision to defend someone when you later find out that they're indefensible.


Your opinions on teachers are really easy to nod along with, until you actually talk to some teachers about the effects they can observe, directly, of the in-vogue obsession with "actual metrics" (standardized tests) on the actual, ya know, education of their students. I'm sure many teachers disfavor more testing because they fear their own unfitness, but I think most of them disfavor it because they feel we are already testing too damn much, trading a tangible short-term harm for a speculative long-term benefit. Every hour a student spends taking a test is an hour not spent learning.

(Apologies for the totally off-topic response to your only mostly off-topic comment.)


That's honestly a very good point; so much of our public school system is already broken because of testing. Students are mostly taught to pass proficiency tests, largely for funding and other occupation-related reasons. I've heard many teachers complain that they aren't truly imparting knowledge but merely repeating from a book because of "the state."


> Teachers' unions are right now fighting the introduction of actual metrics

We would unionize and be fighting metrics too, if they tried to measure how good a programmer you were with the number of goto's you use.


I thought you were doing well until you started in on teachers; you should have left that out, because it's actually a more complex problem than you make it out to be.

I like where you were going on programming though.


Here I thought the analogy worked well for that exact reason. They're both very complex problems. That's why both fields are still in the state that they're in.


Yes, exactly--if assessing teacher performance was simple, then we'd be doing it, wouldn't we? :)


Could you share a link to the student achievement metrics pertaining to the average of peer teachers in the same school?


I believe we have an open and shut case of the No True Programmer fallacy.


The No True Scotsman fallacy is a fallacy because "Scotsman" is an artificial category. There is no phenomenological consequence of being a "Scotsman"; therefore, its properties can be assigned arbitrarily, based on first assigning someone to the set (by marriage, say), and then remarking on the thing all the members of the new, expanded set have in common. (A clear parallel would be a "No True HN Member" fallacy.)

However, if you believe that there is any empirically detectable property that makes someone a better programmer when they have more of it, then "programmer" is a natural category, not an artificial one. It's something where, if we washed away the word for it, we'd end up re-creating the word, as a handle to describe that obvious cluster of things which are unlike other things but like one another. Being a programmer has phenomenological consequences--you can determine who is or isn't a programmer using games or tests which don't have anything to do with programming trivia, and without mentioning that "potential for ability to program" is what you're testing for.

In any natural category, you'll have false negatives and false positives: things that are identified as X but don't have the property that puts them in the X natural-category, and things that aren't identified as X, but which do have the property.

There are many False Programmers. There are also False Not-Programmers: people who don't think, or know, that they're programmers, but who are nevertheless. This is true of every natural category. There are people who think they can sing but can't, and people who think they can't sing but can.

There are professional singers who can't sing, even though they "are" singing. When we say "can sing", we imply the edifice of a market, and competition; we really mean "can sing to a level where we'd pay them more for their singing than a randomly-selected member of the population." In other words, they "can sing objectively-well."

There are people who "can program objectively-well." They are, in the terms of the natural category, both the True Programmers, and the False Not-Programmers. There is no fallacy at work.


Sorry, what is your test for being a programmer? I'm quite curious to hear what more is required beyond writing software, or even how somebody can be a programmer without having written a line of software in their lives.


Writing software well (or even reading existing software well ... which is a must when someone needs to learn, or just maintain, old code) seems to require a combination of cognitive skills. That combination can be detected even when a person has no previous programming experience: http://www.eis.mdx.ac.uk/research/PhDArea/saeed/ .

From the linked homepage:

    >> We (Saeed Dehnadi, Richard Bornat) have discovered a test 
    >> which divides programming sheep from non-programming goats. 
    >> This test predicts ability to program with very high accuracy 
    >> before the subjects have ever seen a program or a programming language.
Edit: added the note about reading software.


>No; I have to object to this specifically. Normal people can't "get shit done." Not really, not successfully. People, when you let them build their own tools in Excel and Access and what-have-you, end up with rosters full of incorrect and incomplete and invalid information, things split into so many different places that nobody can find anything any more, and interfaces that require inexplicable cargo-cult rituals and avoiding otherwise valid input states to use. Their stuff works 95% of the time--they just aren't used to a world where "5% failure rate" means "an error every ten milliseconds."

You just made his point.

All the downsides you mentioned don't matter in the real world and for those people. They are only appreciated by programmers like you and me (and mostly the kind with a slight OCD).

But the upsides TFA mentions are very real: stuff that took them days, now takes minutes or hours. They could not give a rat's ass if it's not DRY, if it doesn't handle corner cases, if it expands in 20 ifs, when a range check would suffice, etc.


> They could not give a rat's ass if it's not DRY, if it doesn't handle corner cases, if it expands in 20 ifs, when a range check would suffice, etc.

I'm not talking about any of that.

I'm talking about things like actively losing track of customers because they were saved to a separate file that got saved over on a network share. I'm talking about billing people two or three times because there isn't a single place to check to see if they've already been billed. I'm talking about being on the phone with a customer service representative who can't authorize anything because your account is in an indeterminate state, and they have to check with management--and your account will never get out of that indeterminate state, so every five-minute conversation you ever have with them will become two days long, as you wait for them to call back the next day. I'm talking about going back to paper because half the time the information just isn't in the database, or is too wrong to rely on. I'm talking about having to hire clerks just to manually print data out of one system and type it into another, because Bob from Accounting "got shit done" without having ever heard of this thing called "networking."

People trying to automate things, without first having a formalized understanding of what the process is that they want automated, will cause business-impacting failures. There's never been a single time where it hasn't, in my years of dealing with this as an employee, a contractor, or a consultant.


>I'm talking about things like actively losing track of customers because they were saved to a separate file that got saved over on a network share. I'm talking about billing people two or three times because there isn't a single place to check to see if they've already been billed. I'm talking about being on the phone with a customer service representative who can't authorize anything because your account is in an indeterminate state, and they have to check with management

Ah, those things weren't done by the kind of Excel-wielding people the article talks about.

Those mistakes were made by programmers proper. With CS degrees and everything.

It's not the small mom & pop or mid-sized company that usually bills people two or three times -- those knew how to bill even before Excel.

More often than not, it's the multi-million enterprise crap large corporations use, with 400 options and convoluted procedures. I mean I've been double billed by the utility (electricity) company, and that's surely not due to the bill being in any Excel file.


I think you'll find that 'large corporations' are awash in Excel monstrosities.

I've worked at a few unnamed places, medium to huge, that used Excel and Access in horrifying ways. One place, with 60000 employees, had an 'editing cone' which was an actual traffic cone that you had to have in your cube if you were writing to the Access file on the SMB drive. During my time there, one person ran their Excel script sans cone and a bunch of people didn't have to pay their bill that month.


> One place, with 60000 employees, had an 'editing cone' which was an actual traffic cone that you had to have in your cube if you were writing to the Access file on the SMB drive

I LOVE this image. Acquiring a physical lock on the file. Think of the manager who thought of this beautiful idea, probably not a programmer by training. Awesome.

And yes, since it relies on human conformance, it's bound to fail on occasion. You can say the same thing about any piece of software you ever came across or wrote.
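The cone really is an advisory lock. Its software equivalent is a lock file created atomically: whoever creates it "has the cone," and everyone else waits. A minimal Python sketch (the file name is hypothetical):

```python
import os
import time

CONE = "editing.cone"   # hypothetical lock-file name: our "traffic cone"

def take_cone():
    """Atomically create the lock file; O_EXCL makes creation fail
    if someone else already holds the cone."""
    while True:
        try:
            fd = os.open(CONE, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return
        except FileExistsError:
            time.sleep(0.05)    # someone else has the cone; wait and retry

def return_cone():
    os.remove(CONE)

take_cone()
try:
    pass  # ... write to the shared Access file ...
finally:
    return_cone()   # always put the cone back, even on failure
```

Unlike the physical cone, the `finally` block can't forget to put it back -- which is exactly the human-conformance failure described above.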


I can't work right now, Tim has the cone.

(this was a pretty great idea from the manager though)


Not in my experience; those were all real cases I mentioned, with real people using combinations of Excel, Access, and SMB shares to "simulate" having actual software talking to an actual database. Usually, in fact, right beside the actual software that talks to an actual database--because someone figured they'd throw something together to hold "an item or two of ancillary data" instead of getting that software modified to include that data--and then their ancillary data-store grew and grew...


At least one regional electric utility uses an Excel macro to scrape its own public web site to gather aggregate usage and price data. This method allows them to save the data, and maybe analyze it later.

Just... absorb that for a few seconds.


"Neither COBOL, nor SQL, nor HyperCard, nor AppleScript, nor Inform, nor any other "human-friendly" language, ever served to allow anyone to express a clearly-defined thought they wouldn't have been able to express in a language with more "punctuation-y" syntax."

The chap on the next row of desks runs 300+ full time students (course success tracking, attendance, nett funding per student &c) on a fairly large Excel spreadsheet. He owns it. He knows it. His colleagues feel that they can edit the data for their students over the shared drive. Works for them.

In the UK the funding methodology for Skills for Life funding draw down is, shall we say, complex. Median funding is £3k per student, so 300+ of those is not far short of a million. The students in question are also subject to monitoring by three other agencies, with overlapping data requirements. The cost to produce an application that embodied the business logic for this edge case provision would, I imagine, be quite high.

It works for us. If it breaks, it can be fixed.


   In a previous paper, Chlond (2005) presented the formulation of    
   Sudoku as an integer program. Chlond claims that a spreadsheet 
   formulation is not straightforward but we present a simple Excel 
   formulation. In light of the current use of Excel in the 
   classroom it can be highly instructive to formulate the problem  
   using Excel. In addition, the formulation of this relatively 
   simple model enables instructors to introduce students to the 
   capability of programming Solver using VBA. In a follow-on paper 
   (Rasmussen and Weiss, 2007), we demonstrate advanced lessons 
   that can be learned from using Premium Solver's powerful 
   features to model Sudoku.
http://archive.ite.journal.informs.org/Vol7No2/WeissRasmusse...


> there would be no coherent thing for the button to do--even if it could read their thoughts--but to interrogate them for hours to get them to decide what they really want.

Which is funny, because that's almost exactly what Bret suggested as a better way for less-technical people to create information-display software in an essay from '06:

http://worrydream.com/MagicInk/#designing_a_design_tool

In short, to take examples given by the user and extrapolate them, letting the computer do the formalizing, and having it ask for clarification on unclear points.

The rest of the essay is absolutely worth a read, by the way.


That is, in fact, in a rather primitive way, what compilers do already; it's just very blunt and baroque, so instead of "please be more specific" you get an error message, and then you have to run it over again to get the next iteration of the feedback loop. The near-term goal in compiler/interpreter design (given still-textual languages) should be more of exactly this kind of interactive communication, where the compiler is "peering" with you to edit your code.

But the full solution requires hard AI, really. "Build me the next Facebook" can't be extrapolated into anything useful unless the software itself can dream of what the "next" Facebook would be like. A human programmer probably already has those dreams on offer.


(Also, I should note, the usual thrust of the "Do What I Mean" button joke is that the person wants to press it because they're tired of having to make choices. What they really are asking for is a Coherent Extrapolated Volition (http://intelligence.org/files/CEV.pdf) button: to cause the machine to, perhaps, simulate a bunch of copies of you, showing you a result for each combinatoric set of responses in parallel, and then pick the result which simulated-you likes most, without having to bother real you with any more questions.)


I counter with the experience of repeatedly encountering plain average admin assistants who have managed to hack together Excel spreadsheets that do exactly what they need to. And all the crappy little bridges that villages whip together from random garbage that they walk on every day for five years. From an engineering viewpoint, these creations are delicate monstrosities, but for the user they are absolutely beloved and insanely efficient. And they are the epitome of highly iterative creations starting with a barely half-baked idea and zero engineering training. Also, have you really convinced yourself that "perfect, robust, unbreakably done shit" actually exists anywhere in the world for someone to view it? Even the space program, in my opinion the pinnacle of human engineering achievement, had several exploded rockets and dead astronauts, and was quite iterative.


I never said that '"perfect, robust, unbreakably done shit" actually exists anywhere in the world'--merely that it can be conceived of, and then strived for. Knowing what it would look like if you saw it, and how it would be different from a kludge, is exactly what makes you an Engineer.

Also, I never said one needs training to be a programmer. As far as I've seen, it's a perfectly natural (or nurtural, whatever) talent, that one then hones over time. The "programmer", selling their work as a programmer, is a false positive: someone who is in the term, but not in the natural category. The admin assistant, serving as their own client and creating a System to suit themselves, is a false negative: someone who isn't, nominally, a programmer, but is in the category.

If you can formalize an idea into something that Works, you are an Engineer. Nobody needs to hand you a certificate; you don't need to call yourself one, or even know you are one; you just are. It's a detectable, testable property of your mental architecture.

The problem is that nobody ever told this to some of the people trying to make their livings as programmers. They're like portrait-artists with dysgraphia, but unlike with that condition, they are the majority of humanity. Actually, let's take that analogy further, it seems sound:

1. Let's say 90% of the population is dysgraphic;

2. but "portrait artist" is a highly-compensated, "in-vogue" field;

3. additionally, the client has no idea how to judge the portrait (maybe an independent 90% of the population is also blind), so any flaws in it won't show up until it gets exhibited several months later;

4. and (okay this is getting a bit ridiculous, but I'll keep on with it) most portraits are the works of several portrait artists, so it's hard to say who caused a given flaw.

If all these things were true, the average portrait-artist's ability would be entirely illegible--you couldn't judge them on results, nor on past performance. This would encourage a market for lemons. Additionally, the set of (people with dysgraphia & people willing to lie and say they can paint) would, just by numerical advantage, outweigh "people who can actually do their jobs" in the portraiture market. It would pay to be extremely skeptical.

But still, there would be false negatives; people who never even considered portraiture, but aren't dysgraphic. Maybe at one point a friend of your admin-assistant asks them to doodle them for a newsletter, and they produce something that Actually Looks Good. Surprise!

Usually, though, the nose will be on the forehead.


Hmm. No.

Just knowing what you want isn't enough. Language matters because you don't want to solve the same problems over and over again. Have a look at Dan Amelang's work on the NILE renderer, and then see if you can still say that language doesn't matter.

What is the smallest set of orthogonal abstractions that, when combined, yields the explicitly desired behaviors, along with the implicitly necessary ones?


I think you're falling into the trap of supporting the OP's point by overreacting.

Do you really believe that it _should_ be impossible for an average person who desires a sudoku solver to get one without any engineering knowledge?

The spirit of Bret's talk, and even this response, is of thinking broadly. The question is not "is this possible", but "should this be possible".


I don't believe it's impossible. In fact, there are two clear ways to do it: either

1. have a database of known algorithms, and map-reduce out the ones that produce the most signal for your data-set (this isn't Hard, but it requires a globally-networked language-neutral ABI-neutral algorithm repository and a free-use cloud compute cluster to run the heterogeneous algorithm-tests on), or

2. expect the computer to invent a novel, efficient (or at least polynomial) algorithm in response to your data-set on the fly. This is a Hard problem--since solving it basically means that computers can now take the jobs of Mathematicians in proving novel theorems. I don't think that's "impossible" either--obviously, Mathematicians are performing some describable algorithm in their heads to come up with novel proofs--but it's likely a Big Data problem in the same way most AI problems have turned out to be; not something you can ask your workstation to do.
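Option 1 could be sketched in toy form; the "repository" below is just a dict of made-up candidate functions standing in for the globally-networked algorithm store the comment imagines, and the user supplies nothing but example input/output pairs:

```python
# Toy sketch of option 1: filter a repository of known algorithms down to
# the ones that reproduce the user's example data-set. All names here are
# invented for illustration.

examples = [(4, 16), (3, 9), (5, 25)]  # the user's "what I want" pairs

candidates = {
    "square": lambda x: x * x,
    "double": lambda x: x * 2,
    "cube":   lambda x: x ** 3,
}

def matching_algorithms(candidates, examples):
    """Keep only the algorithms consistent with every example pair."""
    return [name for name, f in candidates.items()
            if all(f(inp) == out for inp, out in examples)]

print(matching_algorithms(candidates, examples))  # → ['square']
```

A real version would distribute these tests across the compute cluster (the map-reduce step) and rank the surviving algorithms by how much of the data-set's signal they explain, rather than demanding exact matches.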


That is an awesome rant.

People don't know what they don't know. And making things look easy seduces them into thinking that "looking easy" is the same as "is easy." Jon Livesey, an engineer at Sun and later SGI, used to quip, "It is easy to say, like 'largest integer' is easy to say, but actually pointing out what that is, now that is a different story entirely."
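The quip lands because the spec sounds complete but isn't: which "largest integer" you get depends on unstated details of representation. A throwaway Python illustration of the same gap:

```python
import sys

# "Largest integer" is easy to say, but the answer depends on which kind of
# integer you meant -- a detail the request never pinned down.
largest_index = sys.maxsize   # largest native word-sized index (C Py_ssize_t)
bigger = sys.maxsize + 1      # ...yet Python's own ints don't stop there

print(largest_index, bigger > largest_index)
```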

Engineers recognize when a detail is missing and ask about it. Annoying as hell to people who "just want it done" but essential to the task at hand.


http://www.cnn.com/2013/02/27/opinion/ted-prize-students-tea...

Get out of people's way and stop telling them they're stupid, and they will fucking amaze you.


I'm fine with letting everyone try to program. Everyone should try to paint, write, make movies, play music, all those good things.

But if nobody tells someone they're stupid after they've proved repeatedly to be stupid--if we, as a culture, are too nice to leave bad reviews of bad work; if we overlook that time that Bob's excel sheet cost the company five days of downtime, and how we had to hire three extra interns to do redundant data-entry for it--then Bob might think people should be paying him a professional programmer's salary for his time. Bob might hang out his shingle as a programmer. And now, the market has one more lemon.

There are some things which require real, natural (or nurtured-in, at least) talents. Singing, for example. Everything I've experienced in dealing with other programmers tells me that programming skill, in the end, is just an outgrowth of the ability to think logically, systematically, and formally--and that these abilities are part of your mental architecture, and, if they're not determined genetically, at least can only be developed when you're young. By the time someone comes into their first high-school programming class, they already will or won't be an Engineer by mindset, and there's no switch you can flick on them, no number of facts and rules you can teach, to turn that around.

To use a slightly-sour analogy, it's like the cases of children found surviving in the wild, and taken into society. They can learn words, in the same way a chimpanzee does, but they never become able to grasp syntax--their mental architecture has already set, and that component wasn't included. Logical/formal/systems thinking seems to be like this. If anyone needs to teach it, it's parents, and probably around the same age as reading. But we literally don't know what it is that's needed, specifically; what exercise you can do with a kid to "induce" logical thinking. What did we do when we were young? Play with lego? Play pretend?


I'm not completely sure I agree with you yet, but this argument does mesh very well with the trope that so many programmers first started programming when they were very young. I was playing with BASIC at age 5, and anecdotally I've heard very similar stories from other programmers at a rate that exceeds what my expectation would be of a random distribution, and definitely exceeds corresponding anecdotal evidence from other professions (i.e. number of people in profession X who were doing X at a very young age).

The point of possible disagreement is that I'm still inclined to think that any person can learn how to program at any point in time, but that it is just exceedingly uncommon for them to actually do so. Typical CS classes are certainly not going to accomplish it. Like you said, it's about thinking logically, systematically, and formally. I think very few people actually try to learn how to do that late in life. If you don't already have it, you probably don't value it enough to try to get it. It's almost tautological: how would someone who doesn't think rigorously be convinced of the value of thinking rigorously?


This is great stuff, but I'm not sure what he means by "censored from the Internet". The video as posted is the entirety of the talk given at DBX. Was a longer talk given elsewhere?


I'm tempted to say that this is satire of some kind, but I'm not completely sure...


Ironic that the author decries the hacker cult of cleverness by posting a clever satire.


Not very ironic. He doesn't decry cleverness per se, just using cleverness as a barrier against "the multitudes".

(If I'm allowed the hacker cult of pedantry).


Title is misleading. It's not actually a transcript of the talk.


Although HN policy is to change the original post title if it is misleading, adding "[satire]" in this case ruins it. Please allow the smart HN community to discover for themselves what this post is trying to say.


You're right. I put it in because it seemed like so many folks were flagging it down. But hey, let 'em flag.


Why are people upvoting this post? It's attention-baiting and deceptive.


Because it's good?


There's a false dichotomy between educated and uneducated which is perpetuated by schools; indeed, it may be their entire reason for existing. There's also a dichotomy between exploring an idea and solving a problem.

Being ignorant is worse than being informed, but informing yourself is expensive. Becoming complacent due to your perception of your own intelligence is foolish. Not exploring something because it's not in your skillful sweet spot can lead to failure to innovate.

So, there's nothing surprising here. If you educate yourself more you'll tap into a wealth of knowledge about how to get things done more powerfully. If you use that as an excuse to step away from solving problems then you'll likely solve fewer problems, so hopefully you'll investigate things with greater theoretical validity (or else why bother?).

More people using tools means that more people will become more informed and get more stuff done. Tools should not be made so as to be inaccessible. But a technique or idea often begins as inaccessible, and there may even be no way to make it more accessible. This does not mean that it is invalid, and it may be that it gives the people who master it greater power.

Whenever learning, don't learn for learning's sake and don't learn to meet some end point defined by your teacher---seek what the masters sought. They were often solving real, practical problems, and by training yourself to look for that perspective you can understand how to arm yourself with their knowledge in the way that they did.

A person who generates knowledge in an effort to solve a hard problem is usually one who is more than willing to throw that knowledge away if it does not serve them.

Many problems are no longer as hard as they once were. This is directly due to the fact that advanced technologies have become increasingly approachable and that larger bases of people have improved their mental technologies enough to use them. This is a great thing, but it's not an end state---harder problems still exist and gains can be made by tackling them. People can learn more things, and more powerful ideas can be made the foundations of new mental technology to improve anyone's ability to solve problems.

None of this is controversial.


So I can see where Bret is coming from, and many of his thoughts resonate with my ideas, but I'm not so sure there's an intentional conspiracy so much as just a simple alternate perspective that hackers have.

We naturally think of things slightly differently than non-programmers do. Part of the reason the tools he mentions were despised by "all true hackers" is that they were limiting frameworks. We, almost by definition, tend to dislike limitations (especially when we can wield all of the power of lower-level languages).

I was just pondering the other day with a co-worker, what it would be like to write a web server in xl (not sure you could, or would, but what would the mental model be like). It was an interesting experiment.

It's a similar reason to why we tend to prefer libraries to frameworks. I like code that helps me get things done quicker, but not if it enforces some mental model or comes with implementation limitations. Just yesterday I was working around a problem with a certain framework that wasn't making the right system calls the way I needed it to. I knew what I wanted from it but couldn't twist the right knobs to make it happen.


I think the real reason is more along the lines of what they are suggesting.

Programming means writing obscure ASCII codes.

Programmers are afraid to use any tool or language that doesn't look like source code, or that is too easy to use. Because if you are using that, then you aren't programming. And therefore, you're not a programmer.

If it's not complex or difficult for normal people to do, then you are not really programming, and therefore not a real programmer.

All programmers are therefore afraid of tools that make programming easier, because they are afraid they will be judged by others as not being programmers.

This is the main thing holding back programming.


This is fantastic - I am completely in support of this. Remembering how much money the few people who still remembered COBOL made in the 1990s, I can't wait to clean up after the "normal people" get their "shit done".


This is brilliant stuff. And yes, it's satire. Wish I could upvote a bazillion times.


Letting off a little steam about this topic is good, it's irritating to me too. Hopefully this doesn't get taken out of context and actually attributed to Bret though.


Ah, another person talking shit about category theory. Category theory will make things simpler. Just give it time; it's very hard to work at that level of generality when you're not used to it, the dogma has not prepared you, and the tools are all new.


The person is talking about design. When a system (our programming tool) is visible (read Donald Norman's "The Design of Everyday Things"), provides immediate feedback, and shows clearly defined constraints, then the system is usable. When "you need category theory to use the system" (the theory is magnificent, a marvel of the human mind, etc.; nobody said anything bad about the theory itself), then it's an embarrassingly awkward attempt to conceal one's failure at designing a usable programming tool.


You're conflating tools with products.

Having to understand electricity, trigonometry, and basic signal analysis to use an oscilloscope does not mean that oscilloscopes are not useful tools. Those are simply prerequisite knowledge to work as a professional in the field, and the tool is made for professionals.

Nobody is complaining that imperative programmers have to learn how memory works, how conditional control structures work, etc. And yet when functional programmers can benefit from learning category theory, suddenly it's a huge problem. This is just anti-intellectualism, nothing else. The term category theory scares people off, and so rather than acknowledge that failing within themselves, they throw it back on the functional programmers and say that programmers shouldn't have to learn category theory.

NB: You really don't need to learn category theory to program effectively in Haskell. You can benefit from it, but it is absolutely unnecessary.
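To make that NB concrete outside Haskell: a monad-shaped interface can be used purely mechanically, with zero category theory. Here's a minimal Python sketch of Maybe-style chaining (all names invented for the example):

```python
def bind(value, f):
    """Apply f to value, short-circuiting on None -- Maybe-style chaining."""
    return None if value is None else f(value)

def parse_int(s):
    """Parse a string to an int, returning None on failure instead of raising."""
    try:
        return int(s)
    except ValueError:
        return None

def reciprocal(n):
    """1/n, with None standing in for the undefined 1/0."""
    return None if n == 0 else 1 / n

# Any failure propagates as None automatically; no special-casing needed.
print(bind(bind("4", parse_int), reciprocal))     # 0.25
print(bind(bind("oops", parse_int), reciprocal))  # None
```

You learn the pattern by using it; knowing that the pattern forms a monad is optional, which is exactly the point.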


Ok, we can stop with the satire now. We all know monads are part of a long running inside joke that we normal humans will never really get.


Haskell 1.0: 1990. CAML: 1985.

When, then?


Whenever you actually put in the time and effort to learn and understand the abstractions. There are plenty of people who have done so and who are enjoying the benefits of having done so. I'm still a novice by comparison, and I still find my understanding of category theory as implemented in Haskell to make my life easier on a very regular basis, to the extent that coding in other languages for my day job is becoming more and more painful as time goes on.


Well, given that Haskell 1.0 didn't use category-theoretic structures in any serious capacity and its IO was based on lazy streams, I'd say it has already made things easier in Haskell.

How long did it take for calculus to make things easier for most people?


Looks like somebody just read the "How 'One Weird Trick' Conquered The Internet" article and used all the techniques described there to deceive HN users. Maybe this is an experiment. If so, you should all be ashamed that such a big part of you fell for it, even trying to have a serious discussion on the matter. "Hackers hate him! How 1 weird trick could help 5-year-olds learn programming!" turns out to work here just as well as anywhere else; all you need is more advanced deception.



