Things Many People Find Too Obvious to Have Told You Already (threadreaderapp.com)
854 points by JoshTriplett on Dec 1, 2017 | 487 comments



> Companies find it incredibly hard to reliably staff positions with hard-working generalists who operate autonomously and have high risk tolerances. This is not the modal employee, including at places which are justifiably proud of the skill/diligence/etc of their employees.

This one. Every job I see when I go looking is 40+ hrs a week, likely with plenty of idle time and projects I don't believe in. Part of this is regional... my area is packed full of marketing jobs that I'm actively uninterested in. Anyway, I'm starting to get the hang of selling myself to contract shops, and I think I've found some glimmers of hope.

But today I was told straight up that my resume wasn't conventional enough, that it was too well-rounded. After quite a few messages back and forth, I'm not sure they even looked at my portfolio. They told me there was no money for me, because their funders were looking for someone more professional. So I took this as a sign that maybe this wasn't the job for me. No need to go barking up the wrong tree.

In the game dev world, I'd probably be a "technical artist", but even that doesn't quite fit for me. I'm like a "fool of all trades, jack of some, maybe a master of one or two". But it's hard to find a job that motivates me and leaves me with adequate space/time/energy to keep myself sharp and alive.

If anyone knows of funding for a guy that makes creative, reusable web components, let me know... edit: I just saw that Vue.js is primarily funded by Patreon. That's awesome!


It would rarely be effective to market yourself as a generalist on paper.

If you have 3 primary skill sets you should have 3 focused resumes.

If you want a position that requires a generalist, the only/best fit would be founder. In almost all other circumstances, companies hire for a problem and not general optimizations. Even if they need a generalist.

I would posit that you have a marketing problem and not a skills mismatch problem. The noise-to-signal ratio on your resume may be making it seem not to match any position.


Thanks for your insight. It really resonates with me.

I think you're very right about my marketing problem :)

On the surface, my website is kinda out there. I built it from the ground up using Rake and some gems as a static site generator. I think it's very cool, and I don't expect that many people will appreciate it, but I'm hoping that one day someone comes along and says "wow, that's pretty cool, wanna help us with this Ruby thing?"

In a deeper way, this whole discussion is very helpful... it's giving me a fresh perspective on how to frame conversations with future potential employers.

And actually the other day I had an epiphany about my marketing. Sure, I chose an odd path, but I've already done so much of the work! I just need to get my random little open source tools in decent, usable shape so that people who'd find them useful might actually get a chance to use them. I need to get my half-done work finished and make tutorials about it and get myself out there.

This came after I got a call from a contracting company who told me they were bidding on an interesting job that might be a good fit for me... I was so excited that I stayed up all night and finished like three old random projects at the same time, just so I could put more stuff on my portfolio. It was a wakeup call, affirming that there actually just might be some cool work out there after all... so get your shit together!


This is an interesting idea that I'll have to try.

I have the 3 primary skillset problem you describe and have a decade of "founder" experience. I thought the breadth of experience and technologies would be impressive, but I think it just confuses most employers. It seems counterintuitive to intentionally not mention your skills on your resume, and admittedly a little ego-deflating... but this actually makes sense.


I just thought of something else: it might be a good idea to go around and make a collection of all the different job titles that broad-knowledge people (e.g. "technical artist", "full stack") are given in different industries, and find the ones that fit you. Then study the roles. That way you'll have a language for looking for jobs and will have a better idea of where you might fit in and how to have conversations about it.


Your assertion about generalist ~= founder is pretty spot on. I've spent decades working lots of different job titles that interest me (or as the needs of the organization dictated). Putting all that down on my resume hasn't helped me much.

I was heading in the direction of very focused resumes. I've also landed a low-pressure founder role, so that's nice.


A team lead position on a team that has autonomy is also a good place to be as a generalist. It's where I see myself providing the most value to my company. I've had some opportunities to move up to Director+ but blegh.


If you've worked in the different specialties at different jobs (rather than interleaving all 3), won't focused resumes have glaring temporal holes in them?


The idea is not to omit experience, but to spend more real estate on the focus topic: writing a longer description for the experience that really matches the job description, and even wording less relevant experience so that it sounds more relevant.


I think you framed your anecdotal evidence as supporting the OP's statement, but also that you misunderstood that statement, and actually present evidence against it. If companies find it hard to reliably staff generalist positions, _and_ you're a generalist as you claim, then you would find it _easy_ to get a job.


I have shipped React and React Native apps. I've built nontrivial open source infrastructure/devops tools that are used at Companies You Have Heard Of. I've done backend stuff for NASDAQ-listed companies.

I'm not a web or mobile developer. I'm not a "devops engineer", and I'm not a "backend engineer." I'm a problem-solver. But to many (most?) companies, this means that I need to be classified into a track. I need to be an X or a Y or a Z. I can't adopt those roles when necessary--I need to be slotted in and just go.

That doesn't really work for me. Which is a major reason why I'm a software consultant now. I'd go do a full-time job that respected my complete disinterest in being pegged to a particular role and was able to give me the latitude to solve cross-cutting problems at scale. But that's a hard role to define, a hard role to hire for, and a hard role to evaluate--which is why they, and other generalist-focused roles, don't often exist.

It's easy to get a job. I can walk into almost anywhere in Boston if they're hiring and get an offer; I interview extremely well. I can't get a job that actually leverages my skills.


Assembly lines beat generalists. As companies grow they can exploit this to gain increasing efficiency with minimal effort. So, either work for small companies, or be the front line manager who picks up whatever slack is missing to keep things on track.


Try consulting or entrepreneurship.

When you sell your time, you need to fit into a role that someone can comprehend and already knows they need.

When you sell your ability to deliver results, nobody asks about your skillset.


You might have noticed that I said that I am a software consultant. And the pipeline is pretty good. But it's a different thing than actually having something consistently interesting to do.

I don't enjoy entrepreneurship.


Same here. I've been working independently for a few years now and it just becomes less interesting, and for interesting problems where you'd learn something new (i.e. where you're not good yet but know you'd get there quickly) you cannot maintain your rate.

I keep wondering if I should take the rate cut for a year or just study mathematics at the local university (which is free here) ...


Genuinely curious: how would studying maths solve your specialist/generalist dilemma?


Math is multidisciplinary like none other. You could use graph theory, for example, to study relationships in music, sociology, chemical reactions, transportation systems, markets, ecosystems, whatever. And the insight you gain in applying it from one angle will almost certainly provide a fresh light when you pivot back to a different application.

Having a math PhD could also help cut through the bs of people thinking you're not useful because you're a generalist. Suddenly, they'd be asking you to help them with weird little problems. Which is what I'm pretty good at and like to do!


I've thought about going to study math too.


Are clients hiring you as a team of one? If so, it’s easy to see you as replaceable. If you assemble a team then you get to work on more interesting and longer-term projects.

Either you fit yourself into the structure of someone else’s business, or you’re in business. If you’re in business, there’s no lack of diversity in what you spend your time doing. That’s what I mean by entrepreneurship, not “SV startup entrepreneurship”.


You know, that's a good point. When I consult through some of our locals it's very much a team-of-one sort of thing because we're probably too expensive to rope together too many people onto one project. (Not that we're actually that expensive, but false economies around expertise exist throughout tech.)

Something to think about. Thank you.


Sheesh, get over yourself. I'm not trying to be a jerk but this is such an obnoxiously braggy complaint.

Yes, companies have complex problems to solve, that require more than a few days or weeks of work at a time. If you want to just wake up every day and decide that you need to work on a totally new project, you're not going to be very useful.

Lots of people have a wide range of interests, but you pick one of the many areas that seems interesting and that you can contribute to, and you work on that for a while and try to make a difference. "Adopting a role when necessary" is not that different from slotting in to a team, since you can often change teams every few months if you want to, or even work across teams.

What you describe is a hard role to define and hire for, but it's a role that many engineers create for themselves by being both great problem solvers and willing to do what is most valuable for the company. Consider that maybe you haven't gone in with the right mindset to make it happen for yourself.


You've heard of "be a multiplier, not an adder", right? My contention is that many good developers can be multipliers across very large portions of an engineering organization when given the chance. Dysfunction does seem to be the natural state of any business, and engineering groups are no exception; technical and social resources capable of and empowered to tackle cross-cutting concerns are that chance I'm describing and provide self-evident benefits over the long term. You're right, that sometimes it happens informally. I am asserting that institutionalizing it makes teams faster, less risky, and happier.

I also happen to think that I'd be good at it, because it's kind of what I already do, and I wish it was more common because that'd be great. Heaven forfend. I mean, you can try to pull the well-actually-it's-labor-that's-bad thing by trying to turn this into "my mindset is wrong" rather than "let's talk about executive and managerial allocation of resources and whether we do a good job of it across the board," but the thing speaks for itself.

And you're right, that post is braggy. It's also marketing: from this thread I've gotten two emails inquiring as to the state of my pipeline and if I'd have time for a chat. It can be both self-marketing and a reasonable observation of reality. (Email's in my profile.)


I have not heard of "be a multiplier, not an adder". Can you elaborate?


Be someone whose changes make a team perform 2x - 10x - Nx better, not someone who adds a constant contribution to the team.


> If companies find it hard to reliably staff generalist positions, _and_ you're a generalist as you claim, then you would find it _easy_ to get a job.

It's incredibly hard to get a job. But your versatility will ensure job security for as long as you care once you're in the door, thanks to your ability to get into the critical path of different departments' operations.

I'm a competent, high performing generalist who can work autonomously and self-sufficiently with enough confidence in my capabilities to have a high risk tolerance. In my current company I got tossed around from starting under a marketing manager, to getting shifted under the CTO after stepping on IT's toes too often, to getting pushed into a newly formed Operations group, to finally just reporting directly to the owner. While officially I manage a data management and BI team, in practice I'm the guy you escalate random problems and needs to. Creating a BI team was a byproduct of being a generalist and being the first person that needed a consolidated view of everything (and having the ability to execute on that need).

Looking for a new job is a pain. Companies don't explicitly hire for it, politics can come into play if the generalist doesn't know how to dance around the company in a way that doesn't threaten others' fiefdoms, there's no easy way to suss out competence from arrogance during an interview, no manager wants to frontload your salary on their budget when they won't be getting the full benefit of it, and a host of other things.

Generalists are valuable to a company, holistically. They're oil to the machine and fit in between all the specialized cogs to smooth things out and let you run the engine hotter without breaking down. But managers only manage their own cogs, hiring only looks for the cogs they're told to find, and at the end of the day whether there's oil in the machine is Someone Else's Problem. Very few teams exist which have the prerogative and budget to hire for that holistic need, so most companies just go without oil and don't think about it ¯\_(ツ)_/¯.


> "I'm a competent, high performing generalist who can work autonomously and self-sufficiently with enough confidence in my capabilities to have a high risk tolerance. In my current company I got tossed around from starting under a marketing manager, to getting shifted under the CTO after stepping on IT's toes too often, to getting pushed into a newly formed Operations group, to finally just reporting directly to the owner. While officially I manage a data management and BI team, in practice I'm the guy you escalate random problems and needs to. Creating a BI team was a byproduct of being a generalist and being the first person that needed a consolidated view of everything (and having the ability to execute on that need)."

Your job and experience sound very similar to my current role. I don't have an accurate job title anymore, and I very much see myself as the oil that makes the engine of the company run more smoothly. I wear about four or five different hats, all depending on what fires I need to put out. Good to meet someone doing something similar.


I've said this in a sibling thread, but most shops I've seen that don't have This Guy are worse-run, less friendly, and higher business-risk shops. (Which isn't to say that counterexamples don't exist, but I can't think of a good one in my experience. The best companies institutionalize the role.)


Part of the reason my job needs to exist is to address inadequate staffing levels and dysfunctional inter-departmental practices. I would say the company I work for is transitioning from one that is haphazardly run to one that is more professional, but that transition doesn't happen overnight, and you need people to pick up the slack during that transition.


Very true. From what I've observed, it's because a firm with This Guy allows that role to absorb all the odds and ends that pop up and need to be handled, while allowing specialists to focus on what they do best, maximizing their work output and general work satisfaction.

A firm without This Guy means that more small problems/needs get ignored, since they don't clearly fall into anyone's scope of work or warrant the cost associated with a specialist. And if something gets large enough to not ignore, a specialist will consistently have to context-shift to something outside of their specialty when something tangentially related pops up. This decreases their productivity considerably, decreases their satisfaction as they can't focus on what they truly enjoy doing, and increases business risk as things get overcomplicated or slip through due to a more biased view on the strategy to perform the work (based on whatever the individual's specialty is).


How did you get into a role like that? Sounds like something that would be fun.


It is fun. As for how I got into it, I got hired for a different role (test engineer), proved myself in this role but also took on extra responsibilities outside my remit (I should note that I also had worked in various different technical roles in previous jobs). An opportunity came up to be a founding member of a new Business Analyst team and I took it, but because of the structure of the company I'm in I didn't leave the IT Development team fully, so now I sit somewhere between them. In the space of a week I can be doing data analysis, business analysis, software testing, software development and user training. The hours are long, but I'm seeing improvements in the business culture gradually coming in as a result of this work, which is satisfying.


> most companies just go without oil and don't think about it

Practice-wise, it astounds me how many people/institutions expect different results from everyone else while hewing to the same practices as everyone else. Why do you think you'll be the exception? Why can't you look at the reality of what is working and what isn't?

This applies to so many areas of life, but it is extremely pronounced in business, where everyone cargo cults every little practice of AmaGooFaceSoft and believes that is what made them who they are. It's like there's an extreme deficit of original thought.


A lot of companies have "A" teams they send in when necessary. The US Digital Service was one example of this.


I think the anecdote and OP’s statement support one another - companies have a hard time reliably staffing generalists and one of the reasons for that is that they are bad at identifying them and determining how to use them.

Parent showed them a generalist skill set and they saw a lack of professionalism.


It'd be straightforward to find/get a programming job that'd more than take care of my material needs. There are a buttload of entry/intermediate level jobs out there that'd provide more money than I even know what to do with. If I wanted a job really bad, I could study up for it, put the technologies they're looking for on my resume and apply.

The hard part is giving in to a job. I think my hesitancy in this area straight up reeks to the point H.R. people/recruiters can pretty much smell it. It's just not about the money for me. If I was doing something I enjoyed and had time to work on creative projects (a.k.a. "my life's work"), I'd be fine on ~ $1000/mo (say at ~20-40 hrs/mo). But this presents difficult hiring/management challenges, the main one being the challenge of using my time effectively. When someone is always on call, management doesn't have to worry about this as much.

Honestly, I think companies could get a lot more bang for their buck if they put more money into hiring people to maintain and improve on open source projects that they use. And I know that there are jobs like this out there. I just need to keep looking for one that fits me :)


The idea of failing at "giving in to a job" resonates so much with me. Thanks for taking the time to formulate this thought into words.


You think you're too good for a job, hence, you are unemployed.


I went through my entire 20s thinking this, and I have strong opinions about it so sorry for this long post...

Back when I quit my 9-5 job in 2006 my favorite quote was by the creator of Winamp... "For me, coding is a form of self-expression. The company controls the most effective means of self-expression I have. This is unacceptable to me as an individual, therefore I must leave."

No company deserved me because I was a creative innovator, and even though working retail for $10/hr wasn't using my CS degree, I'd rather do that than make $70k getting someone else rich. I thought I deserved $250k since I was obviously going to get them rich - so taking the $10/hr job doing mindless work was less of a sell-out.

The next 8 years I tried a lot of things, and went $20k in debt. When I reached my 30s, I realized a few things:

1. You can be a mercenary and get the skills you need for your own business while working for someone else. You learn and get faster while doing tasks for your employer, so even though the code you wrote is owned by them... the code you write for your own business will be better and done faster.

2. If you're worried about getting them rich... just do what they say instead of innovating. Most companies don't innovate, they just maintain the status-quo and have you build systems that meet existing needs and save a little bit of money. You aren't making the next Snapchat at a 9-5... and if you are, you should have equity.

3. Freedom is not just about having the time to do things.... I had all the time in the world in my twenties but no money. Being broke basically put me in a prison. I couldn't go on trips with friends, couldn't take girls out, couldn't do what I wanted because I'd have no money for it. Now that I have a full-time job and work on side projects, I have the opposite problem where I have money but no time.

Overall I think having a 9-5 job works better for my lifestyle. Maybe I just needed more discipline and I could have succeeded as an Indie developer making games, marketing software, apps... and whatever other shiny thing caught my eye... but now I do that stuff on the side. I still hope I succeed as an entrepreneur and reach a point where I have complete freedom (time & money)... but now that I'm older I realize that was always a long-shot.


There's nothing wrong with making people rich in exchange for money they give you. That's how jobs work. Making someone else rich is something to be proud of and aspire to. Not as cool as making yourself rich, but still good.


I don't mind if my boss or the CEO of the company gets rich, I think they're good guys. I'm just worried they would sell out instead of keeping the company private.

Take Tumblr for example - they sold to Yahoo. Verizon bought Yahoo. If I was an engineer at Tumblr I'd be pissed that an evil monopoly like Verizon that's fighting against net neutrality is suddenly using my code.

I'm more humble now than I was in my 20s, and I don't think my code is anything special, but I still fear that the private company I work for will sell everything I built to someone I consider evil. If I have no control and no equity I'm going to act like a mercenary... I'll do the job efficiently and according to specifications, but they don't get anything extra. If they have an idea and have me build something that gets them rich that's great - but I'm not going to be like the guy who built GMail on the side at Google... I just do the job they tell me.


You're afraid the code you write for an employer might be used by an evil company. We could apply the same argument to taxes. The taxes you pay may help fund actions that you believe are morally reprehensible. To a large extent, these things are beyond your control, and by overanalyzing you're simply limiting yourself.

So what I am saying is at this point, do not worry about all these things. Keep up the side-projects, and if you do that over the long-term, you may one day find yourself in a situation where you have the freedom to dedicate your efforts to what you find meaningful.


> Take Tumblr for example - they sold to Yahoo. Verizon bought Yahoo. If I was an engineer at Tumblr I'd be pissed that an evil monopoly like Verizon that's fighting against net neutrality is suddenly using my code.

But wouldn't you be happy that all your fellow coworkers also probably got to benefit from the sale? If you or your coworkers got rich enough then you could all just focus on doing things that you think matter or better the world!


I don't think that at all. I actually admire and respect people who are able to buckle down and focus on a full time job. I also acknowledge and respect that this is not a fit for me. There's nothing wrong with me or with people who that works for, and this is my challenge. Please don't interpret my frustration as a condemnation of traditional employees/employers. I'm learning how to navigate.


Where did you get that from? Does this not seem like a really rude thing to say to you? Do you not value the well-being of others?


You think there is stigma in being unemployed.


A salaried job that only gives you 5-10hrs/week is a niche unto itself.


You're absolutely right, and not even salaried jobs, just part-time jobs:

https://tech.mn/jobs/?q=&type=1&role=0&cat=any

vs.

https://tech.mn/jobs/?q=&type=2&role=0&cat=any


Another way to see the statements as non-contradictory is: since companies cannot hire people like this, they structure themselves such that such people are not required. Then, when one does happen along, the company has no place for him.


Or her.


> Parent showed them a generalist skill set and they saw a lack of professionalism.

Yeah, that's part of the pattern. Generalists often do not appear as professional as specialists (which does not mean that everybody who looks unprofessional is a generalist).


Right. And believe me, I'm not the pinnacle of professionalism, but I have core values of respect and a motivation to do work that people find valuable. Last time I checked they are pretty well intact. I believe this person mistook specialism and accolades for professionalism.


I think there's a difference of terms. Wanting to work 20-40 hours a month on stuff you like instead of "giving in to a job" sounds pretty far from what most people are going to consider "a professional" (someone who works on what the job needs at present, for standard hours, and gets paid for it).


This explains why everyone in tech fights over space on the latest bandwagon every time no matter how astoundingly mediocre the latest thing is.


It's strange, but I think at least some companies find it hard to staff positions, while also at the same time ignoring/rejecting suitable candidates.

Case in point - I'm a generalist who can very credibly claim significant expertise in two disconnected areas (compilers and distributed systems), plus some startup/product management expertise. I saw a job listing here for a startup that needed both of these and claimed they're looking for remote work - so, even though I'm not actively looking for a job, I wrote them; the match between my skills and their needs seemed too good to be true.

They wrote back the standard rejection mail - didn't even try to talk to me. I thought that maybe they found someone else, but no, they still advertise the position. I know it sounds arrogant, but there's no chance in hell they can find many better candidates (I doubt that they can find even one, but well, good luck).

Are their recruiters uniquely ignorant? Unlikely. I think many companies put a lousy effort into recruiting, and then wonder why they can't fill the positions. One of the reasons "internal referrals" work so well is that the candidates are at least seriously considered.


I would bet my lunch that your scenario was a keyword mismatch. Always keep in mind that the "hiring person" at startups is usually the non-technical person. So you have to get past the skills list provided by the CTO/CIO/Lead Dev.

I have seen three approaches to limit this. The first is to stay super field-specific but implementation-general, without being keyword-void. The second is to go super rabbit hole on every keyword/tool/language/platform you have ever worked with inside the field.

The third approach is to have a type one resume with a type two cover letter. That approach has usually been the most consistent for callbacks in my circles, and allows for your rabbit hole to be of the same type as the job description.


You might be right (though there may be other explanations, e.g. ageism or just wanting local employees and writing "remote" just for the heck of it, "to get more leads"). But even if you're right, that just reinforces my point: I'm not desperate to get a job; I wrote to have a discussion, because they managed to pique my interest... I didn't put any kind of effort into "marketing" myself, thinking they'd figure things out in the interview(s). They were basically fortunate that their advertising hit the right target. And what did they do with that lead? They threw it away through a non-technical filter....


I think that’s an interesting cross section. I’ll talk to you. Email in profile.


These kind of job listings may just be used as a cheap form of advertising to show investors and (potential) stakeholders that the company is growing and - therefore - doing well.


Companies sometimes aren't honest about their job postings asking too much, and instead blame the worker population for not having the skills they need.

The more skills you try to fill with one position, the less likely it is that a candidate will match all of them.

Take any job posting with 20-30 lines of skills or requirements and it's really no wonder they can't find someone that meets all or even half of that.


I’ve definitely seen postings around asking for 4+ years experience in React or something equally ridiculous (React has existed for 4 years).

I think that hiring practices haven't really caught up to the pace of reality yet. Nobody does the same thing for more than 5 years anymore; even if you've been at a company that long, what you're doing today does not look like what you were doing then. The chance that a new hire already knows all the parts of your stack is basically zero. What you want is someone who soaks up frameworks like a sponge.


Not to mention the companies that specifically write the ads so that they can claim their person doesn't exist and bring in an H1-B holder for half the rate. The fact that these shysters ruined it for the companies that actually need good people at good pay is a major problem.


There is a massive gulf between hiring and work. There's another massive gulf between what a manager asks for and what HR or recruiters ask for.

> "If companies find it hard to reliably staff generalist positions"

The process of hiring for this is hard. Finding people like this is also hard.


There's also a gulf between what companies think they need, and what they actually need. Almost everyone hires on technologies and not the abstract skillsets required to learn said technologies.


Any idea how to find a company that hires on that abstract skillset, potentially by throwing one into cold water to test for it?

~Someone with little resume-driven experience but good grokking skills


Honestly, the way I got hired recently was by just reading job postings and cramming on the frameworks mentioned in them until I was competent enough to pass an interview/coding challenge. I’m a little peeved that it was essentially company training on my dime, and you have to count on companies not refusing to hire unless you have x years, but it did get me hired.


Big companies understand this and that's reflected in their hiring practices. They don't care what skills you have, just that you're smart.

The way they throw you into cold water is using whiteboard interviews. So if you don't like those... sorry.


No, I wish. The closest you'll get is companies that might hire on transferable skillsets (e.g., Ruby <-> Python). The best thing you can do is have a portfolio of work that shows your understanding. It's what we had to do when I was doing graphic design and illustration work.


I got found by a ~10ppl YC company and the deal fell apart over $12K.


Exactly! And even if you do find them, convincing them to work for you is also hard... lots of these people don't respond to money in predictable ways.


Companies find it hard to staff generalist positions because they don't know how to interview/hire generalists. If a generalist interviews with a team that has several specialists, then one of the specialists can easily knock down the hire decision because the candidate isn't _as_good_as_ the specialist.


Something being hard does not imply it's expensive. There are many ways hiring a certain kind of person can be hard even with a massive number of them outside your door looking for a job.


One of Marx' theories was that people are estranged from their labour by making it a purely economic entity. To me that's exactly the reason behind the reactions in this thread (and also why I feel the same).


I'm a fellow generalist. I got lucky that the company I'm with values my work, but I can see something that isn't said out loud about generalists but appears to be true (in my experience at least)... generalists are harder to manage.

If you've got someone who is largely autonomous and can wear multiple hats, steering them in a single direction may be more tricky than a specialist that doesn't seek work outside their remit.


>But today I was told straight up that my resume wasn't conventional enough, that it was too well-rounded. After quite a few messages back and forth, I'm not even sure if they even looked at my portfolio. They told me there was no money for me, because their funders were looking for someone more professional.

Assuming your skillset is good, run, don't walk, away from those idiots.


Some resources you may find of interest:

* "Have Fun at Work" by William L. Livingston -- Livingston provide examples of how organizations are often incapable of hiring the people who might help them most.

* "The Murdering of My Years: Artists and Activists Making Ends Meet" by Mickey Z. -- This book is more in the category of seeing how bad so many creative people have it (until perhaps we get a basic income some day, Star Trek replicators for subsistence, a gift economy, and/or better government planning).

* "Drive: The Surprising Truth About What Motivates Us" by Dan Pink: https://en.wikipedia.org/wiki/Drive:_The_Surprising_Truth_Ab...

* "Punished by Rewards" by Alfie Kohn: http://www.alfiekohn.org/punished-rewards/

* "The Abolition of Work" by Bob Black (which reflects your point in another comment on "giving in to a job"): https://web.archive.org/web/20161031034600/http://whywork.or... "Only a small and diminishing fraction of work serves any useful purpose independent of the defense and reproduction of the work-system and its political and legal appendages. Twenty years ago, Paul and Percival Goodman estimated that just five percent of the work then being done -- presumably the figure, if accurate, is lower now -- would satisfy our minimal needs for food, clothing and shelter. Theirs was only an educated guess but the main point is quite clear: directly or indirectly, most work serves the unproductive purposes of commerce or social control. Right off the bat we can liberate tens of millions of salesmen, soldiers, managers, cops, stockbrokers, clergymen, bankers, lawyers, teachers, landlords, security guards, ad-men and everyone who works for them. There is a snowball effect since every time you idle some bigshot you liberate his flunkies and underlings also. Thus the economy implodes."

Contrast with notions of "full employment". Or, sadly, also most of what Silicon Valley is up to these days...

About a decade ago, I sent that last link to an Apple recruiter who contacted me -- never heard back. :-) Not that I am a "Steve Jobs" or would want to be one, but, as with Livingston's point, would Apple hire a Steve Jobs as an employee? Almost certainly not. See: http://careerfuel.net/2013/06/if-apple-wouldnt-hire-steve-jo...

Though for balance, on the value of working together with others even in difficult circumstances, see "Buddhist Economics" by E.F. Schumacher: http://www.centerforneweconomics.org/buddhist-economics "The Buddhist point of view takes the function of work to be at least threefold: to give man a chance to utilise and develop his faculties; to enable him to overcome his ego-centredness by joining with other people in a common task; and to bring forth the goods and services needed for a becoming existence. Again, the consequences that flow from this view are endless. To organise work in such a manner that it becomes meaningless, boring, stultifying, or nerve-racking for the worker would be little short of criminal; it would indicate a greater concern with goods than with people, an evil lack of compassion and a soul-destroying degree of attachment to the most primitive side of this worldly existence. Equally, to strive for leisure as an alternative to work would be considered a complete misunderstanding of one of the basic truths of human existence, namely that work and leisure are complementary parts of the same living process and cannot be separated without destroying the joy of work and the bliss of leisure."

I've collected some more ideas on making workplaces better here: https://github.com/pdfernhout/High-Performance-Organizations...

From what you describe, if you can't find a direct match to what you would like to do, you may want to consider being frugal and then earning what money you need to live in a low-stress part-time occupation with the rest of your time then available for what you really want to do. Have you ever considered, say, cleaning carpets to make ends meet? https://web.archive.org/web/20030807105050/http://www.unconv... "More than a few people agree the best career would be one which provides challenge, intellectual stimulation, and rewards for quality work. Many however, would be surprised to discover they can have all of those benefits and more in some of the unlikeliest of careers. Case in point: I'm a professional carpet cleaner. Some people think this is a second-rate career. I don't agree with them. Carpet cleaning gives me challenges, intellectual stimulation, and many other rewards. To prove this, permit me to walk you through one of my work days. ..."

If instead you focus on "selling [yourself] to contract shops", you may find this book by Aaron Erickson useful: "The Nomadic Developer: Surviving and Thriving in the World of Technology Consulting".


This is great, thank you! And yes, I'm pretty much doing this now. I live with musicians who hustle very hard. I'm a musician too. I'm familiar with the life. I'm very happy and fulfilled in so many ways, just exhausted.

I think one of my best bets right now is to lean hard into teaching, especially piano teaching. I have one committed student and a few more excited but broke students who take lessons from time to time. It's very rewarding and there is lots of mutual respect.


I'm not sure what he meant by "modal employee". Is it in the statistical sense of occurring most frequently?


Looking at it, I think he means the statistical middle employee.


I thought it was a typo for "model employee", but your interpretation makes more sense.


I think that's exactly what he meant.


Many people can have the skills to do X, but if I can't rely on you to do that for me for the next year at a high quality, then I don't want to take the risk.

You are not just a good frontend engineer you are a good frontend engineer that is competing with other good frontend engineers. You lack professionalism (it sounds like) so you lose against those guys.

Good luck. At some point you either find the place that works for you or you learn to expand your skills. Seems like you need to improve your tolerance for downtime. It could be worse.


I don't lack professionalism because I'm unable/unwilling to provide the same level of support that full-time specialists/shops offer. I'm just not a fit for jobs where that's what's needed. Maybe that makes me less of a specialist, but it doesn't make me inherently unprofessional. And a lot of times that's not what's needed! As an artist I'll tell you that in art worlds there are plenty of professional jobs where you do something well and walk away... There are others who are better suited to taking what you've built and integrating it over a longer term; in particular, people who have a stake in the project.

Have you ever done freelance web stuff for small businesses? There are some people who want you to be their webmaster... You give them a CMS and they want you to go fix their typos. So you say "Okay, sure, but it'll be $50/hr" and then they get frustrated because they don't have someone doing that kind of work on their end. They think it's your job to "make their website", but don't want to take ownership of the thing.

This is an analogy for the kind of thing I'm talking about. I want to help solve problems, and I want to provide value for my employers, but I'm unwilling to mindlessly give up my time to people who don't value it outside of their myopic needs. A lot of times, that's what's expected of "a professional". Well I don't want that. I'd rather be an amateur who's also a respectful and savvy business partner, who's definitely interested in helping you with your project but also keeps healthy boundaries for myself.

I'm definitely more of "an amateur" (lover of) than a professional, but I don't see why that should make me automatically "unprofessional" in my business relationships. On the contrary, I think it's fully possible to "be professional" without providing unrelenting 40+ hr/week service for a year. If that's not what you're looking for then with all due respect that's fine! But please don't blanket assume I'm "less professional".


> CS programs have, in the main, not decided that the primary path to becoming a programmer should involve doing material actual programming.

Well, that's because CS programs are designed to teach CS, not to be a vocational school for programmers.


I've been spending a lot of time lately wondering if people who treat colleges and universities as though their job is worker training are confused, or if I'm naive in believing colleges and universities exist to expand human knowledge through research and education.

I suspect the answer is that our culture is confused, we want people who are smart and well rounded to also be useful. And the ugly side of it is, we want clear class divisions. One would never refer to law school or medical school as "vocational".


> One would never refer to law school or medical school as "vocational".

Sure you would. They are 100% vocational schools. Their cost, acceptance rates, time commitment, and difficulty level may be higher than other vocational schools, but that doesn't change what they are -- 100% focused on giving you the knowledge and skills to do a specific job.


Law schools (or at least the top 20 school that I went to) are definitely not 100% focused on teaching you the vocation of being an attorney.

Hell, they aren't even focused on teaching you to pass the bar, which is why most people pay for separate cram schools.


Yet, going to law school is a prerequisite for the vocation of lawyer.

Arguably, "a school one must go to in order to practice a vocation" can be referred to as a "vocational school." I understand that vocational schools are often thought to be purely practical and not theoretical, but I think this definition does a disservice to the reality of entering certain vocations.


No, it is not a pre-requisite. At least in the U.S., several states still allow you to pass the bar and become a licensed attorney without going to law school.


That's the point - they are, but are not referred as such.


They are in fact called "professional schools" -- this captures what both of you are saying.


They are actually called "professional" schools, but yeah it's a euphemism that means the same thing.


Well then where are all the "vocational" CS programs at universities?


Canada has ‘software engineering’ degrees which are essentially vocational CS degrees.


To add to this, Canadian Universities also have co-op programs.

You don't have to have an engineering degree to get a co-op job. The student gets paid well over minimum wage (and learns on the job) and the employers get financial incentives from the gov't to hire students. It's win-win for both the student and the employer.


Software Engineering at Canadian universities does not contain a dramatically different level of vocational training (with the exception of Waterloo).

Co-op programs are optional, and take place in the summer or in a gap year. The affiliation of companies hosting co-ops is loose, and the schools charge large fees for participating in these programs (with the exception of Waterloo).

Speaking from personal experience, results may vary.


Which Canadian schools have software engineering degrees which are vocational?

Note that my degree is from a well-reputed Canadian school that has a "software engineering specialization", and as far as I can tell (though I did not take it) that specialization is indistinguishable from the main CS curriculum, and is not valid vocational training.


Medical and Law schools are postgraduate.


In some places. In Ireland, law is a normal undergraduate degree with some extra vocational training after you graduate. Medicine is an undergraduate (5 year) programme, during which quite a lot is vocational - students spend multiple months in hospitals during that period, and when they graduate they are still considered as training.


Coding boot camps


... which are not at universities.


And will also not prepare you for a job in computer science.


They might prepare you for an intro-level job in software development though, if you actually continue learning things and honing your skills after completing them.


For what it's worth, these bootcamps are the current "trade schools" for software jobs where the barrier to entry is lower. They're accelerated programs that use job placement as a strong selling point.

It's almost like they rose from the death of many for-profit technical schools (DeVry, Westwood et al.) and took their ecological niche. But in contrast to bootcamps, these for-profit schools were too big, slow, and ineffective.


in the business college.


Higher education is used as a proxy for intelligence by companies who are hiring because they are not allowed to use IQ tests. So I think it kind of makes sense that our society is confused about the purpose of higher education.


It's not just an IQ test, it's a test of ability, which is different. Your college record shows:

a) whether you were able to get the work done, and b) whether you were smart enough to do the work well.

It's the combination that's important, not either single one. Employers want people who show up on time, actually care enough to try, and are smart enough to figure it out.

The college degree has a better than zero correlation to that. After you get the job it's mainly about performance and output, once you've been promoted a few times your college degree mostly doesn't matter.


I agree with all of that, though spending 4 years to make that determination seems crazy to me. A competent employer can figure out which hires are worthwhile in about a month or so.

There’s another issue with using educational attainment as a proxy of intelligence: it doesn’t really help you distinguish among candidates who all have a bachelors degree. All you know is that they’re above some threshold, but not how far above. You can use the prestige of the school as a proxy for this, but it all seems like a hell of a lot of hassle when you can figure this all out in a half hour with an IQ test.


IQ tests tell you how good someone is at pattern recognition, which is a major factor in the ability to learn. They don't tell you what someone has learned, or whether they have the critically important skill of perseverance/grit. In fact, high IQ can reduce grit, since clever pattern matchers use their cleverness to avoid working hard on the toy problems of childhood.


My personal take on this is that high IQ quickly turns into a liability in childhood. Particularly when entering the school system.

I believe the words "bored to tears" summarize the experience quite well.

While the majority of pupils get a standardized curriculum designed to keep their interest at a steady pace, pupils with a high learning capacity get no such thing. If they try to learn faster, it doesn't fit with the governance and management model within which the teacher must operate. Therefore, most teachers are at a loss when faced with these statistical outliers.

Also, other pupils may experience emotions of inadequacy and unfairness when a fellow pupil just blasts through the material in minutes that would take them all week. This may lead to the high capacity pupil being a target of some unfortunate group dynamics.

Since schools have no governance model for this, the high learning capacity pupil's school experience is essentially unmanaged. At a loss, society almost invariably resorts to platitudes like "No need to feel sorry for them, because they are so fortunate to be smart. We must focus on the pupils that struggle."

A few years down the line, the pupil's inner motivation may be completely replaced by depression, self-blaming or worse. Then the platitudes take a turn for the worse with blaming the pupil's willingness to work: "In fact, high IQ can reduce grit, since clever pattern matchers use their cleverness to avoid working hard on the toy problems of childhood."

Fortunately, my own school days are long since gone. Without blaming any person in the system, I can say: "Good riddance!" to this whole pitiful affair of how society treated me as a child.

Instead, I can draw attention to this problem by saying clearly that these are children that never asked for these gifts in the first place. Let's as a society realize that what we are doing to them is absolutely wrong and woefully irresponsible.

Fortunately, at least one western country has political attention on this right now. I will work hard and with "grit" to make sure that my experience and observations can help in creating new policies. In particular I wish to address how to practically leverage the pupils' own drive to learn without incurring social/peer stigma or ridiculous costs.


This is a huge problem.

We are perfectly happy to identify athletic talent and enact systems so that the athletically talented are matched up with peers who share their gifts, but we mostly refuse to do this for intellectual talent.

I was fortunate enough to attend a program that handled this pretty well. Basically it was 120 kids from 11-18 who skipped high school and attended university together instead. It’s called the Early Entrance Program and it’s at Cal State Los Angeles:

http://www.calstatela.edu/academic/eep

The issue I face is that I am raising my children in Portland, Oregon, so this program will not be an option for them. Our personal approach is to home school instead. And I’ll probably investigate starting something like the Early Entrance Program at the state university in Portland here.


It also shows that you could do that for several years, following the rules set by someone else, and passing.


Nice double meaning with 'passing' also meaning something like: "able to appear to be what those in power want me to be."


I would not interpret it that way at all. Being able to understand requirements and implement stuff based on those requirements, or work as part of a team, or maintain dedication to a task for weeks or months at a time, has nothing to do with teenage notions of the world being populated by conformists vs. nonconformists (which ironically is usually based on very conformist notions of what it means to be a nonconformist).

On the other hand, I have known a number of people who used their self-identification as a nonconformist as an excuse for why they couldn't cut it someplace, whether it's school or a job or some interpersonal situation. It's a bad excuse because it gives the person an easy out for not reflecting on how they could have handled the situation better, or avoided getting into the situation in the first place. It also pushes all of the blame on other people, which is comforting but very rarely 100% true. We should all be our own worst critics, lest someone else beats us to it.


For better or worse, it _is_ a proxy for many things. But the one that I think is most relevant to a typical job is the ability to see something that is sometimes fun, sometimes boring, and sometimes challenging through to the end, typically staying put in one place while doing so. It's not intended to be vocational training, but in a sense it matches very well with what's expected of the typical employee: show up consistently, deal with stress and various people and tasks with an ability to see it all through, without quitting or pissing too many people off. If you can cope well and make it through undergrad, you can likely handle a vast array of jobs.


> because they are not allowed to use IQ tests

There's actual employment law that prohibits using IQ tests? I'd never heard that before


An IQ test can be used during the hiring process but the employer has to be able to defend its use as directly relevant to the work at hand. So most employers aren't willing to spend the legal fees related to defending the use of a direct IQ test... Possibly because the college degree is a good enough litmus test for IQ to satisfy their needs.


If it's difficult to prove that IQ is relevant to the job... Maybe it's not. Also, I doubt that's the reason.


It’s not difficult to demonstrate that IQ is relevant to job performance (it’s the best predictor of performance for most jobs). The problem is to demonstrate that it doesn’t have a discriminatory effect (there are large, persistent differences in IQ between races).


There's already an imbalance when it comes to different genders and races, I don't get what they would lose.

The real reason companies don't do it is because Fizz Buzz is a better test than IQ tests.
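For anyone who hasn't run into it: FizzBuzz is the canonical minimal screening exercise, asking the candidate to print the numbers 1 through 100 while substituting words for multiples of 3 and 5. A quick sketch in Python (the language choice is arbitrary; this is just to show what the test is):

    # FizzBuzz: print 1..100, but "Fizz" for multiples of 3,
    # "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
    for n in range(1, 101):
        if n % 15 == 0:
            print("FizzBuzz")
        elif n % 3 == 0:
            print("Fizz")
        elif n % 5 == 0:
            print("Buzz")
        else:
            print(n)

The point isn't cleverness; it's a floor check that a candidate can write a loop and a conditional at all, which is closer to the actual job than an abstract pattern-matching score.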


Source on that first claim?



It's related to anti-discrimination / "psychology-test" laws. If there is any sort of "test" given as a condition of employment, it must be relevant, standardized, non-biased, reviewed by specialists and approved, etc.

http://cdn8.openculture.com/wp-content/uploads/2014/07/Test1...

The value and definition of "IQ" is widely disputed and generally seen as of limited relevance to the actual work and performance of that work (uncorrelated). Look it up and do some research if you're interested, this is just a quick showing of _why_ tests may be considered bad.


One day I wonder if some lawyer will strike it big by making an argument that looking at universities is discrimination because e.g. family connections and wealth give you a big leg up in getting into certain elite schools...


Yeah, but most universities engage in affirmative action, which theoretically negates this.


The linked image is one I've seen before: it's a Louisiana state voter literacy test that was used for voter suppression. It was disproportionately administered to black voters. [1]

I'm not saying that IQ tests are useful measures of ability to do a job. I just wasn't aware there was actual law or precedent forbidding their use.

> If there is any sort of "test" given as a condition of employment, it must be relevant, standardized, non-biased, reviewed by specialists and approved, etc.

Wouldn't this disqualify a lot of whiteboard interviews too? Most startups don't have a standard question bank reviewed by experts or grade on a standard rubric.

1. http://www.slate.com/blogs/the_vault/2013/06/28/voting_right...


Whiteboard tests are directly relevant to the position being hired for and, thus, okay to use.


The classic "how many gas stations are there in the USA?" type of interview question seem exactly like an IQ test and not at all relevant to job requirements.



If IQ tests were a reliable measure of any job capability, surely the brilliant minds at Mensa would already have made a mint by selling recruitment services.


See this comment elsewhere in the thread: https://news.ycombinator.com/item?id=15828513


Well, part of the problem is that mensa is a total crock of shit.


Whatever it is, it's a group filled with people who score very high on IQ tests.


You may not know this but a primary purpose of Mensa is to help "smart" people find other "smart" people to socialize with. In that sense it succeeds (I'm a member).


Define "crock of shit?" It's just an organization with membership requirements, by that logic organizations such as the VFW are a "crock of shit."


For fuck's sake, cut out the middle man and just request SAT / ACT scores.


I'm Canadian so I have never taken these tests, but I was under the impression that a large component of scoring well on them is cramming?


Not really. I think of the SAT as a great equalizer since a brilliant kid in rural Nebraska will do well and be able to go to an elite college, while a silver spoon heir will spend thousands and maybe improve their score but certainly not max it out.

Not to toot my own horn but I got a perfect score on the math part of the SAT with little prep, and I've tutored others on the same and only managed to eke out a little improvement. (In contrast, I've tutored math course work and had more success there.)

The math isn't very advanced, it's just a matter of doing it quickly and accurately.


> I think of the SAT as a great equalizer

The truth is more nuanced [1].

[1] https://www.washingtonpost.com/news/wonk/wp/2014/03/05/these...


From a psychometric perspective, they are IQ tests, in that they show the same correlation with IQ tests that IQ tests show with each other.

From a research-into-today's-hot-button-issues perspective, they are routinely slammed for being about cramming, but the effect of test preparation on test scores tends to be very close to zero when you look into it.


I did very well on both tests in high school without much cramming (I skimmed one study book from the library), and scored something like 90th percentile when I took the ACT in middle school.

Whether this is correlated to anything important in real life is an entirely separate question.



You're correct.


I did well, but I also took it with a severe migraine. Can I take it again now?


>Higher education is used as a proxy for intelligence by companies who are hiring because they are not allowed to use IQ tests.

I think you're wrong about that,

http://abcnews.go.com/US/court-oks-barring-high-iqs-cops/sto...


Your link is to a story about an employer who uses a test which is a proxy for intelligence (they even give the rough equivalency between the test they use and IQ). I think that supports my point that intelligence is one of the more important things employers are looking for. Most employers don’t have the resources to develop a hiring screening test and then certify that it is not discriminatory, so they use educational attainment instead.

Another way to approach this is to ask yourself: if not as a proxy for intelligence, then why do some jobs require a college degree?


Still wrong.

https://www.hiresuccess.com/blog/is-employment-testing-legal

>I think that supports my point

I didn't say it invalidated your point. I'm trying to draw your attention to a specific area in your statement that is wrong. You can simply say,

>Higher education is used as a proxy for intelligence by companies

That's enough. Nobody will argue with you about that.


IQ tests used for hiring do exist in the US - https://en.wikipedia.org/wiki/Wonderlic_test


Yes, and that exact test was the one that is legally fraught: https://www.wonderlic.com/resources/publications/white-paper...


Higher education seems like more of a "class marker" to me.


Or maybe the causal factor for both class and higher education is intelligence. The Bell Curve makes a compelling case that the transition away from agriculture toward knowledge work over the last 100 years has created a society in which intelligence is funneled into higher education and from there into high paying professions.


The Bell Curve is a famous case of quasi-science. You are putting statistical speculation up against centuries of well-documented explicit discrimination policies.


Have you read The Bell Curve?

You may disagree with some of its conclusions (I personally am skeptical of its conclusions about racial differences in IQ), but to call it pseudo science is, I think, unfair. FWIW, Steven Pinker, who is about as thoughtful and level headed as they come, agrees:

https://mobile.twitter.com/sapinker/status/84691284703672320...

And, anyway, the point I was making had nothing to do with racial IQ differences. It was referring to what the rest of The Bell Curve is about (discussion of racial IQ differences makes up only about 25% of the book): how the US society’s transition away from most people working in agriculture toward most people working at desks has created a class structure stratified by intelligence.

It’s actually a shame that the book discusses racial IQ differences at all, because everyone got hung up on that and totally missed what is, IMO, a much more important point. And I think it explains a lot about current politics in the US.


You seem to be rather confused about the Bell Curve.

It's my understanding that the main data presented hasn't been overturned or disproved (in fact, quite the opposite).


Meh, IQ is potential, not capability, to do a job. I don't find this argument convincing. What about all the literal geniuses who have no interest and never studied CS?

At least the graduate can show that he/she can commit to a multi-year long, sometimes boring, project successfully. A high IQ person could just leave your company after 6 months because "they're bored".


I’m not saying employers would hire solely based on IQ. That would be crazy. It’s just a screening measure. But so is educational attainment. Nobody gets hired because they have a college degree, but plenty of people don’t get hired because they lack a college degree.

Also, agree that educational attainment gives you more information about a candidate, but in terms of assessing intelligence, it is a coarser measure than IQ. I’ve hired a few people in my day, and there is huge divergence in intelligence among college grads. For lots of jobs it is useful to be able to distinguish between not-stupid, bright, and ridiculously smart. Educational attainment doesn’t really help with that, but IQ tests would.


People aren't confused. Colleges do many things.

They're also very expensive to attend, and the most immediate value they offer is to get people jobs


I've heard doctors speak of themselves as having a trade skill... albeit one that pays well.


Fascinating point. Is "training vs education" a useful pair of terms? Universities used to be for education; now it's often assumed they are for training only. (I come across "Bachelor of Science in Hotel Management" nowadays - a science in what?!)

I often hear people talk as if education was something that happened to them, then it finished. (Like learning is something you just do in an institution.) I guess training is more like that, teaching knowledge and skills for some particular task/role/job.


How is learning how to run a hotel not an education? Are all the Agriculture and Mechanical universities not education?


Well, that's the point - those things, training to do a particular job, are now thought of as 'education'.


> And the ugly side of it is, we want clear class divisions.

What? I suspect that it's typically, if not exclusively, those in the upper classes that want this. The ugly side of it is that human nature is selfish.


Actually, I suspect that it's the middle class (or middle classes for cultures with more than three classes) that are most anxious about being mistaken as lower class. Actual upper class members don't usually care – they're not relying on university attendance to protect their positions.


I definitely would refer to law and medical schools as vocational, much more so than many other educational programs.

I have known several people who went to medical or law school and every single one of them went only because it is a prerequisite to becoming a doctor or lawyer. What is your explanation, that people attend these schools to expand their mind, and then entirely coincidentally, manage to end up in the corresponding career as though by accident?

Get real.


Ugh...I hear this sort of thing all the time and I find it sooo obnoxious.

Look, the reason that all the students go to university is to make money. Like 90%+ of undergrads go there to make money, so whether it "technically" should be a monastery of cloistered academics pondering the universe (based on some high minded notion of what learning "ought" to be for) is beside the point. University costs a shit ton of money and people (the vast majority) pay it so they can have a decent life.

And the fact that universities are, on the whole, teaching skills that do not help their students accomplish this means either A) they are actively misleading their students B) university is a class indicator more than a place to learn (I can spend huge amounts of money so I'm probably not poor and black - which sucks for all sorts of reasons) or C) they don't understand that teaching academic CS, as opposed to programming, is not what industry demands (so they suck at what they're selling).

These are the people who, quite literally, are billing themselves as the "smartest guys in the room". If that's the case they can well stop sucking at their bloody job!


There are two things wrong with this perspective.

First, if you walk out of a four year degree in computer science without a strong foundation in discrete mathematics, algorithms, data structures, and at least one of distributed systems theory, stochastic systems theory, control, statistics/machine learning, AI, data visualization, or design, you've stunted your horizons for growth as an engineer pretty badly. I use three of those every day at work. I wouldn't be able to read the research, do the math, or write the software to implement what I work on without a very strong education in fundamentals. The only CSE course I took in University that I haven't made active use of was an advanced matrix computation course. I mean, if you're just shitting out web services and REST APIs or converting COBOL to Java, fine, but that's something you can train a monkey (and in very short order, a computer) to do.

Secondly, we invest in Universities as a public good. You should come out of a four year program able to write well, critically analyze difficult material (not "Catcher in the Rye"), articulate complex ideas persuasively, and understand the world you live in and how to live in it substantially better than when you went in. You might be surprised to learn that with the benefit of hindsight, many people find the time they spent reading and learning to be incredibly useful later in life, as it provided a foundation for dealing with conflict and complexity. If that's wasted on you, then it's wasted on you, but understand that you have shot yourself in the foot in terms of your ability to mature as a human being. I personally think public University education should be substantially more rigorous and should fail people aggressively, and should also be free. I wish we separated the whole "whoo look at me I'm 18-22 let's party" thing from the thought of University. If that's where you are in life at 18, there's nothing wrong with that... go work as a code monkey or a barista and get it out of your system, and when you're ready to level up as a human being give University a try, go. That's more or less what worked for me, anyway.


>You should come out of a four year program able to write well, critically analyze difficult material (not "Catcher in the Rye"), articulate complex ideas persuasively, and understand the world you live in and how to live in it substantially better than when you went in.

What a great comment. I think that's also missed by schools where the focus is on memorization and/or trivia. It would be great if they focused on the areas you talked about, not necessarily directly, but at least that is the goal.


Well, not that it matters to my argument, but I did go to university, and did well (in the academic sense - much of my time would probably have been better spent drinking and networking rather than getting good grades). Personally I think it was an incredible waste of time. Even though I worked incredibly hard in many diverse subject areas, most of what I learned was taught at such an abstract level as to be completely unusable in any practical sense. Almost all of what I needed I later learned on the job as needed. Of all the things you've listed, why can't you just look them up on the internet or hire a tutor at worst?

As far as "leveling up as a human being", books are free - go sit in a library. Hell, grab a kindle; most of the cool stuff that I learned about the human experience I learned from talking with others, staring at the sky, hiking in mountains, and reading free books that people on the internet recommend. None of this requires taking out the equivalent of a mortgage or listening to someone lecture on it. If you want a lecture, those are free online too!

I just get the sense that university is used as a filter to differentiate yourself from people who are "below you" - not as smart, not as hard working, not as "mature as a human being". But the main barrier to get into a university seems to be connections and money, neither of which I have a huge amount of respect for. Many of the wealthiest and well connected people in the world are huge assholes, and many of the poorest and least connected are incredibly wise and kind (and intelligent and hard working). It feels like a way to separate yourself from the unwashed hoi polloi and put a barrier to entry between yourself and other people who could do your job. In that sense university encourages laziness!

Overall, I just get this sense that university is this bullshit thing we all just agree to participate in because those who go get a monetary benefit, even though everyone knows it's just this bullshit thing. That monetary benefit comes from being a class filter, a networking tool for like minded rich people to find other like minded rich people, and probably a few other class effects I don't really understand. When I was younger, I thought that as I got older I would get more jaded and just accept bullshit things. It would certainly make my life easier to do what other people do and not fight against the tide. But if anything I've become more altruistic and romantic; I want the world to be more honest and people to do the right thing.

I don't know. I've probably benefitted from it even though I wished people judged me based on other qualities that are more important.


First, you should know that the "barrier of connections and money" you talk about does not apply everywhere. Actually, a lot of universities are free so long as you prove you are smart enough through entry exams. So while university is still used as a class filter everywhere, realize that in a lot of countries everyone has an equal chance to get into any university.

Secondly, universities are not meant to prepare you for a better salary or job. They are meant to give you a particular set of skills and knowledge from a field of work. Finishing means that you usually get access to a certain community with the same specialty. This is useful if you want to keep up to date with your field or publish new findings.

Software companies that require computer scientists usually are just full of it and they forget that there are other universities that prepare people for software engineering or even just programming. You could argue that a course or bootcamp is enough. But we already know they ask for too much when you look at the requirements they have. They aim for the best and then take what they are given.

I think universities let people in too easily nowadays. In my father's time, there were around 10 slots for the arts university in my country each year. All of them ended up becoming praised artists. Actually, it makes national news when one of them passes away.

Now, universities just take everyone, especially if they pay for a slot. This caused a shift in perspective. You are no longer hard-working or lucky to get into a university; you are now expected to get into one to prove you are not a complete moron.

This is the actual problem, and it is reflected in your arguments. I agree, you probably didn't need to go to university. Hell, I know that I didn't need to go through mine. However, we were both expected to, and for no good reason.


> universities are not meant to prepare you for a better salary or job

Regardless of what any individual may think a university is or isn't meant for, this is how they are used.

As a high school dropout, I was directly told by managers in more than one situation that all I needed was a Bachelor's and I would be eligible for better positions. There was zero discussion of skills, as it was already understood that my skill level in many areas was much higher than that of the degreed people I worked with (who also weren't using their educations for anything other than as a "door pass" to be considered for better jobs).

The system is broken.


Where the heck do you work? How long did it take for you to get to that position? What was your role as a junior?


> the reason that all the students go to university is to make money. Like 90%+ of undergrads go there to make money

Which is it, "all" or "90%?" I went to a private liberal arts college so maybe that automatically puts me and my classmates in the 10% but I did not think the overwhelming majority of us were there to make money. The fact that lots of students still major in subjects that can be very worthwhile and valuable but lack obvious paths to financial success (e.g. most of Humanities and a lot of Social Sciences) indicates "making money" is not why they're there.

You think universities should basically be trade schools. They're not and they shouldn't be. Trade schools should be trade schools but the U.S. has a dearth of them and many that fit the description aren't good enough.


I really hope we see greater specialization of mid-line universities that don't know whether they want to be an academic school or a trade school. Programming-as-inquiry is extremely difficult due to the amount of effort and math required to produce novelty, and programming-as-trade is much less difficult in comparison as a lot of the systems you're composing together already have their principles worked out. You can work from the manual in the latter case, while you're defining the manual in the former.

Meanwhile, a university not knowing whether it wants to be a trade school or a research school has mixed incentives that result in an inconsistent quality of education. You get professors who don't know how to teach and teachers who don't know how to go in depth. By requiring both, you've made your staffing requirements that much harder to fulfill if you don't want to trade off education quality.

On that note how have bootcamps been doing lately?


> The fact that lots of students still major in subjects that can be very worthwhile and valuable but lack obvious paths to financial success (e.g. most of Humanities and a lot of Social Sciences) indicates "making money" is not why they're there.

"Financial success" doesn't necessarily mean optimizing the total number of dollars in your income; it can also mean being able to do work that you enjoy while having a decent standard of living. If a college degree had no impact on your employment opportunities, how many students would still go?

> Trade schools should be trade schools but the U.S. has a dearth of them and many that fit the description aren't good enough.

Which is the point. It's not that universities shouldn't exist, it's that they shouldn't be trying to serve both roles.


> the reason that all the students go to university is to make money. Like 90%+ of undergrads go there to make money

> Which is it, "all" or "90%?"

Ok come on man. I said 90%+. Don't quibble just to be a prick.


If you can't stick to posting civil, substantive comments, we're going to have to ban you. Nitpicking is annoying but it doesn't give license to damage the site in your own right.

Would you mind reading https://news.ycombinator.com/newsguidelines.html and https://news.ycombinator.com/newswelcome.html and abiding by the desired spirit when posting here?


The name-calling seems a little uncalled for.

I also went to a good liberal arts college and had a similar experience. People studying subjects that probably wouldn't lead to financially lucrative careers were in the majority.


> University costs a shit ton of money and people (the vast majority) pay it so they can have a decent life.

First of all, no one is forced to go to university, so if people pay that much money, it's probably because they feel it's worth it, regardless of its supposed "cloistered academicism".

Secondly, maybe the main problem is that something that is needed (or at least very convenient) to have a decent life shouldn't cost a shit ton of money at all. As is the case in countries like Germany, etc. (if your parent post is from one of those, your reply doesn't even make sense).

Finally, I would rather hire a coder with a strong CS background than one from a vocational school. The former may not know the language du jour and the framework fad of the day, but can pick them up easily because they know the fundamentals. The latter will likely adapt much slower.


To be fair, they do a pretty bad job teaching CS too. Your average undergrad program is almost entirely learning about CS, not learning CS. A non-trivial number of students get through a CS degree never doing anything novel in CS—a real shame because, unlike math, research-level CS problems are eminently accessible to undergrads.

It doesn't even have to be "real" research: I just expect more than classes which have you regurgitate what you learned to pass. And yet that's all you need to get a CS degree. It's a bit like learning to write by filling in blanks.


> Your average undergrad program... CS degree never doing anything novel in CS

Is this different than any other field? I was under the impression that most BS programs do not require novel research. That doesn't come until post graduate courses.


I don't necessarily mean publishable—that's doable but difficult in CS—but novel in the sense of requiring non-trivial creativity and insight within the field rather than replaying things you were taught directly.

An example might be writing a compiler for a language you design with features not covered in the book. It's probably not publishable, but it is doing CS in a way that just following instructions in a book or from lectures to write a compiler isn't.


So personal projects that extend the material you learn in courses and take 1-2 semesters to complete? I think plenty of schools already do that as part of a BA/BS graduation requirement.


You must just have had a bad CS program; my undergraduate degree involved plenty of those types of projects.


I think it doesn't come until a second degree, or sometimes a graduate degree. Not post graduate.


Something was lost in translation here. Graduate degree and postgraduate degree mean the same thing, at least in the US.


I don't think industry needs every CS graduate to produce something novel. The work of many is just building upon the work of others.


"novel = of value" in this case.

Fill-in-the-blank programs ("Here's the skeleton, make it display a list") and 3 projects with < 200 lines that aren't doing something CS-specific are not novel, but all too common and widespread.


A lot of things of business value go like this, for example: go build this UI in React, etc. You don't need the ability to contribute something significant back to academia to do that. You just need good coding mechanics.


Not the point: college is for CS. Not for "building on the work of others." You learn the basics that everyone needs to do the advanced work. If that isn't needed/useful then you need a vocational school instead of a CS program.

Making people do continual simple demos and nonsense programs instead of teaching them actual CS is BS. They can be simple programs, they don't need to build huge things, but what they build should reveal and teach them things beyond how to shift text around the screen.


Are there vocational programs beyond those 20-week'ish coding schools? Because those don't seem to equip people with quite enough skill.


> classes which have you regurgitate what you learned to pass

I think that's what school is, no?

I'm not trying to be glib; I think this is just a reality of how education works in pretty much all subjects (typical US academia, YMMV, etc.) Can you name a major that doesn't work this way?


I had to write real proofs (though I didn't fail completely if they weren't free of errors) for my U.S. math undergrad degree.


Did those proofs have to be novel, or were the proofs of already known concepts?


I'm sure those proofs at the time were novel to the student but obviously not the professor who assigned the problem. That's how the student learns.


Novel to the student up to a point—math students are taught a variety of proof techniques in each of their subjects, and the theorems that students are required to prove generally fall to very similar techniques.


They have to be novel in my experience.

I have a BS in Mathematics from a mediocre school (University of Arizona) and exams were always "prove this fact (which you have never heard of before)", never "what is the proof of XYZ theorem (which we studied)"


There is so much room for questions that it's quicker to write the proof than to search for an answer. And a readily found answer would, for undergrad, less likely be in proof form, or too distinct for specific topics later on. And the instructor might become suspicious and interrogate you, in which case, if the material helped you to understand, that's the point.


Nostalgia: I really love the 60s definition of CS at Stanford, which seems more inclusive of practical aspects than the theoretical bent of today:

"I consider computer science to be the art and science of exploiting automatic digital computers, and of creating the technology necessary to understand their use. It deals with such related problems as the design of better machines using known components, the design and implementation of adequate software systems for communication between man and machine, and the design and analysis of methods of representing information by abstract symbols and of processes for manipulating these symbols. Computer science must also concern itself with such theoretical subjects supporting this technology as information theory, the logic of the finitely constructable, numerical mathematical analysis, and the psychology of problem solving. Naturally, these theoretical subjects are shared by computer science with such disciplines as philosophy, mathematics, and psychology."

http://i.stanford.edu/pub/cstr/reports/cs/tr/65/26/CS-TR-65-...


And yet many job listings for programmers demand you have a CS degree…


My first boss once told me that when he looks at hiring, having a CS degree means that you know how to find solutions to problems. Being able to code can be learned on the job if needed. Most junior jobs in bigcos are usually code monkey type of work anyway.


Because for many employers "ability to program" is a minimum requirement, not the full job description.


For far more employers, "ability to program" is something they have no idea how to filter for in a pile of resumes, but "CS degree" seems like it might be a usable proxy.


>Because for many employers "ability to program" is a minimum requirement, not the full job description

I think this was the author's point. You state it is a minimum requirement, implying a CS degree is a more challenging requirement. Yet it's not hard to find people with CS degrees who do not have the minimum requirement.


That's because CS grads always want to pair with their own and put heavy work into "planning" how to do things well rather than just starting to work on it.


...yet that's not obvious to everyone, hence the inclusion.


Forgot I just had this argument a couple of months ago: https://news.ycombinator.com/item?id=15502022


It's not about that fact. The author argues that programming schools should be vocational schools (while ostensibly presenting a list of objective facts).


The author doesn't seem to be making a normative argument here. Rather he is observing that something many people believe (someone with a CS degree will be a trained programmer) isn't true.


I replied to your sibling comment here: https://news.ycombinator.com/item?id=15827977


> The author argues that programming schools should be vocational schools

I don't see where. The implication is that they are assumed to be, contrary to reality. It's supposed to be a list of obvious facts.


The full tweet, from the link: "CS programs have, in the main, not decided that the primary path to becoming a programmer should involve doing material actual programming. There are some exceptions: Waterloo, for example. This is the point where I joke "That's an exhaustive list" but not sure that a joke."

This reads to me that Patrick wishes that the joke were not true, which is corroborated by other comments of his, such as https://news.ycombinator.com/item?id=7761134#7762383

The fact that you think that the article is in any way objective is why it is so dangerous. Ideology is like water. Patrick's assumptions and biases shape how he presents his "facts" and which facts are left out. It's not objective! What he is really doing is presenting a specific market-based worldview, which has certain insights and major deficiencies.

Consider for example "There is no hidden reserve of smart people". It sounds like an actionable insight, but is it true? Is it even falsifiable? What does it mean? And what actions would it lead you to take?

My answer: it reinforces the perspective that smart people are the only people worth considering, that there are few smart people, and (in context) that these people are mostly "generalists" and entrepreneurs. It values "intelligence", probably measured in terms of short-term ability to make money, and leaves out values and self-reflection. It leaves out the possibility that you can learn from anyone (c.f. the story of the $20 fan[1]). It leaves out the possibility that your hiring process could be discriminatory or unwelcoming. And so on.

This, I argue, is a good demonstration of why programmers need liberal arts; technical decisions are never just technical decisions. We must remember to ask who wins, who loses, is that desirable, is that just?

[1] http://cs.txstate.edu/~br02/cs1428/ShortStoryForEngineers.ht...


> The author argues that programming schools should be vocational schools

I still don't see this argument being made in ANYTHING you posted now.

> The fact that you think that the article is in any way objective is why it is so dangerous.

How does this relate? I asked how you came to a conclusion, and you have gone off into other topics that you're using to characterize a perspective or ideology. I don't care about what you think or what the author thinks. I care about finding how you jumped to a conclusion based on the article's text.

The handwaving to make your own intellectual assertions (right or wrong) has nothing to do with my issue with the content of what you originally stated. Good luck with whatever.


Yes. I often explained that CS was more like a math degree than an engineering degree.


An engineering degree won't teach you how to do things, either. I have one. Grads aren't able to build bridges, silicon chips, airplanes, or whatever else engineers are supposed to do. (Ok, I built a radio in my first term. But it was a very simple one.) They just have some intellectual tools that are useful precursors to gaining such knowledge.

You can liken CS to something like Chemistry. Is Chemistry about test tubes and HPLC devices? No. Will you learn how test tubes work when you do a chemistry degree? Probably.

It just happens that the practical skill practiced by CS majors is coding. And coding has this unusually large applicability that wet lab work doesn't. Does that mean CS is about coding? No. But you'll probably learn more about coding doing CS than most other things.


Mechanical engineering, fluid mechanics, electronics, all have a common core of mathematical modeling techniques. An engineering degree is fairly flexible, and one can even pivot into CS. But CS has no commonality with engineering, and a CS degree cannot pivot into engineering.


It wasn't that many years ago that most colleges didn't offer a true CS degree, but rather a math degree in CS. When I was applying to colleges in the early 90s I remember it being roughly 50/50.


Ironically, an engineering degree is also like a math degree.


Meh, at least my (mechanical engineering) degree wasn't.

Sure, you learn some mathematical tools that can help you solve some very practical problems, but it stops about there. I was not taught anything close to creative mathematical thinking. I was never taught the fundamentals of things: the axioms and the theorems on which the tools we "learn" are based. I can guarantee you no one in my graduating class could read a proof of even modest complexity, let alone write one.

Granted, I did go to a school with a good reputation, but one of a very practical/applied program. Still, I'm pretty convinced that math and engineering are very, very different fields, even if all engineering programs include a handful of math classes.


My engineering classes (Caltech) were nearly all applied math classes.


“Applied math” isn’t very similar to what you learn in a reputable math degree, so the point stands.


No it is not. I studied math and the engineers only went as far as differential equations, basic (non proof based) complex analysis, and so on. There were zero engineers in theoretical classes.


> non proof based

What's that about? Does the lecturer just state theorems without proving them, or does it mean that the students aren't required to be able to prove them?


It means the activity of the class (lectures, homework, exams) is to compute numeric and algebraic solutions according to the methods and procedures you're taught. The outcome of a class is generally that you're able to solve a new kind of equation.

This describes 99% of the math that most Americans (including engineering students) who are not math majors will ever encounter. Proofs may be presented as curiosities, and students may even be quizzed on them, but students would not be expected to come up with proofs they haven't seen before.


Electrical engineering surely is.


An applied math degree


Having attended Waterloo, this isn't very surprising at all. The CS department is part of the Math Faculty.


I would claim there is a significant portion of the population for whom empirical teaching of CS would be far more kind and efficient than going through the Dijkstraist pure theory route. Even Dijkstra began with writing actual programs until he grokked how to think about it and then moved to pure theory.


>Well, that's because CS programs are designed to teach CS, not to be a vocational school for programmers.

And which is why they should be moved out of the College of Engineering and placed into the college that has mathematics. Curiously enough, the only school he mentions where he can reliably know their grads have programming experience actually does place CS under mathematics.

I think being able to program basic tasks should be a requirement for all engineering programs. They all require a course in CS - mostly not because they want ME's to know CS, but because they want them to be able to program basic things.

Yet, many/most engineering grads are really poor at it. I feel that for any given engineering degree program, we should make all seniors who have declared they will graduate that semester take a basic exam on very basic concepts that all grads of that program should know (strong emphasis on the word "basic"). If it's acceptable for them not to know basic material in an introductory class, then remove the class from the curriculum!

Example: I was in a top 5 school helping a grad student in CS do basic probability homework. We were disagreeing on the correct answer, so I suggested he write a simple program to simulate it and compare with his calculated answer.

This is the simplest program possible: A for loop, and a random number generator.
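
Something along these lines would have settled it (a hypothetical sketch - I don't recall the exact homework question, so assume we're estimating the chance of rolling at least one six in four rolls of a die):

    import random

    # Simulate the experiment many times and compare the observed ratio
    # with the analytic answer, 1 - (5/6)^4.
    trials = 10000
    successes = 0
    for _ in range(trials):
        if any(random.randint(1, 6) == 6 for _ in range(4)):
            successes += 1

    print("simulated:", successes / trials)
    print("analytic: ", 1 - (5 / 6) ** 4)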

He had no idea how to do it.

I'm not picking on him because he's a CS student. I think it's a valid question for people in all engineering programs that involve introductory programming courses. What's the point of making such a course a mandatory requirement if they can literally do nothing based on that knowledge?

Similarly, an electrical engineer who does not know, say, Kirchhoff's laws should not be allowed to graduate.

Yes, I do think universities are for education, but if they cannot apply what they learned, they are not educated.


I went to a polytechnic university (i.e. one with a "learn by doing" focus). Our engineering students definitely had to have some programming classes, but they were taught separately from the CS ones. Most engineering students considered them an annoying roadblock in the way of getting back to their "real" work, and I (a CS student) spent a fair amount of time tutoring friends. I'll have to ask some of them if they use any form of programming now, in their day jobs. I suspect I'd get a mix of answers.

As for the CS students, lectures were often on the theory, and homework was usually focused on implementing demonstrations of the theory. A grad student who wasn't a decent programmer was either a cheater in undergrad, or didn't come from our school.


>A grad student who wasn't a decent programmer was either a cheater in undergrad, or didn't come from our school.

This particular student came from a different engineering background. Yet, he still got admitted into a top 5 CS program.

And whether he did CS or not in his undergrad is beside the point. I think this is a task any engineering grad should be able to do. If not, the university should not be listing programming as a core skill they teach their engineering students.

BTW, he is not unique. I don't think many (perhaps 20-30%) of my fellow grads (non-CS engineers) could code something that simple either when they were graduating.


You are being very critical, a part of university education is tolerance, although some opinions differ on if that relates to knowing literally everything. Trade-offs are a part of engineering because optimal solutions don't always exist. I'm saying, programming often is just knowing which functions to pick, which is not an inert skill except for reading comprehension, if there is a good documentation. There's nothing universal to learn there, the interesting bit is writing those functions. But a pseudo random number generator is rather advanced stuff.

I hope you helped out as well as you could to bridge the gap, and maybe you were rightfully disappointed if the student hadn't prepared to catch up to undergrad material in his spare time.


>I'm saying, programming often is just knowing which functions to pick, which is not an inert skill except for reading comprehension, if there is a good documentation. There's nothing universal to learn there, the interesting bit is writing those functions. But a pseudo random number generator is rather advanced stuff.

First of all, I'm not asking them to know "advanced" random number generation theory. As a kid, using random number generators in BASIC was one of the first things we did when we learned programming - it makes writing programs a lot more fun. Granted, it's not usually covered in introductory programming courses. But all that is needed is one API call. I pretty much told him what he needed to do. I showed him the API he needs to generate random numbers. We talked in big picture terms about the algorithm (set up the experiment, run 10000 times, take the ratio of success over number of runs). We talked about the theory on why such an experiment would answer his question. I helped in every way other than the actual coding. Had he written some code, I would have helped debug (some probability experiments are tricky to code correctly - that is acceptable).

He just couldn't translate the problem to code.

>You are being very critical, a part of university education is tolerance, although some opinions differ on if that relates to knowing literally everything. Trade-offs are a part of engineering because optimal solutions don't always exist.

I definitely am being critical. When you graduate, you get a seal from your institution that is supposed to be a guarantee of something. My question is: What is that something? If you start dealing with math graduates from a university and you find they often cannot do basic algebra, you stop valuing the seal the university provides - which damages all math graduates of the university. They are working very hard to get a certificate that is not valued by others.

Would you trust a physics graduate who cannot do basic calculus? You can argue that he may know all the theories (conservation of energy, electromagnetics, etc), and understands the principles behind calculus (area of curves, rates of change, etc). But if they struggle with basic evaluation of integrals, you probably will wonder about what kinds of courses they taught in his university that allowed him to get a degree without actually solving problems with calculus.

This is the problem the author is talking about. If you consistently interview people with computer science degrees, who cannot do a simple fizzbuzz, you stop valuing the degree - and that is the general landscape in the software world. Those who worked hard to gain a lot of knowledge relevant to both theory and practice are right to be upset that because of lax (or at least misguided) standards from their educational institution, they have to go through more hoops to convince people of their skills.

It goes beyond the simple mantra of "Universities teach computer science, not software." I would question the technical ability of anyone who cannot write such a simple program, with as much guidance as I provided - be they a CS major or a math major. I'm not asking for practical knowledge like whether unit testing is useful, or how to architect large software, or the virtues of OO over FP. Just a basic program. Can you write it?

Certainly there is not universal agreement on the curriculum. I'm saying there should be agreement on some basics. Historically, universities handle this by ensuring some introductory course is in the curriculum. My argument is that in the US the education system is a little too modularized, and there should be more coupling. If programming is seen to be a core skill that engineers should know, make writing code an aspect of every engineering course (it's really easy to construct assignments/projects in pretty much any engineering course that could involve writing code).

Don't let someone get an A in the introductory course in his freshman year, and then allow him to forget pretty much everything he learned from it by the time he graduates. If you do, you wasted the student's time, the instructor's time, and the time of all the interviewers who cannot trust your degree and have to check for themselves if this person knows the basics.

Again, it all boils down to: The university is giving you a degree. Is this degree a guarantee of anything? If so, what? I'm all for education for the sake of gaining knowledge. But how many of us would be OK with gaining the knowledge without any kind of certification from the university? Not many, I suspect. I actually did this personally (dropped out of a PhD after many years, as I felt I had gained the knowledge but did not feel a degree would help me), and can assure you from the conversations I had with others who tried to convince me otherwise that pretty much everyone wants that paper to have some value.


There is a well known delineation between applied math and pure math. It seems that such a convention has yet to be established for CS.


Not sure that's an accurate comparison. We make the English majors do a ton of writing, the art majors produce a lot of art, and no one seems to have any issue assigning lists of problems to math majors as homework.

I suspect there is just a cultural issue, where the CS faculty doesn't want to do that style of teaching. I had a professor hand out debugging assignments because that was part of the industry feedback they'd gotten, but only the one professor listened to the feedback.


Sometimes* the best way to learn things is to do them, a lot.

- Graduate of the North Avenue Trade School

(* Most times)


I think there is a thin line here. For example, I was taught to program in Smalltalk. On the one hand, that is not a commonly used programming language, and on the other hand, it teaches Object Oriented Programming like no other language.

So yes, after that you can't start as a Senior Java Developer but I think you have a very good foundation to quickly pick up other Object Oriented languages.

So you do not have to teach the latest hyped language to give students the ability to program. Instead you can use clean languages to teach CS and let your students exercise them to show you that they understood what you taught them.


Reminder that patio11 might be saying all these things without trying to imply value. I.e., he's not necessarily saying any of these things are good and bad; the "obvious" part is just that they are true.


My CS education turned out to be “coding in a few fundamental domains” and I was very happy with it. We implemented some fundamental algorithms and components across operating systems, networks, distributed consensus, compilers/interpreters, visualization, sensing and perception, databases, machine learning, etc.

You were free to construct a degree out of proofs rather than source code if you wanted, but those of us who desired a more practical education could fill our days with systems programming.


What's the point of learning CS if you can't write a complete program? What the hell else are you going to do with it? Try to think of yet another way to write a sort algorithm?


Don't worry, most CS programs aren't designed to teach CS either. They teach a random assortment of technical material that is neither terribly difficult nor terribly relevant.


Yet not having a CS degree makes getting a programming job 10x more difficult.


> People underestimate how effective a generalist can be at things which are done by specialists. People underestimate how deep specialties can run. These are simultaneously true.

This is really, really important. In my experience, this is not just unknown but actively ignored and disagreed with (either explicitly or effectively) by a large majority of people involved in software businesses. This hamstrings: companies, individuals' growth, and the advancement of the state of the art.


There's also often a messy transition period where a team goes from a bunch of generalists to more specialists. This can be extremely rocky for all involved as often generalists have lots of historic operational knowledge and strong opinions, but the new specialists come with external knowledge of best practices and how to scale.

It is important that historic knowledge is respected for the context it provides, while outside knowledge is equally respected for the deeper context and data behind it.

Unfortunately, this change often means generalists lose jobs in favor of specialists as a company grows if there isn't a good place for them, or if they don't rise to management. In fact, even early managers can be displaced by outside managers who come in with more management experience. And this is not inherently a bad thing for the company (while it may be for the employee getting bumped). The people that take a company from point A to point B are rarely the same people that take it from point B to point C.


Am I crazy to think that generalists are the engineers best suited to making the managerial move? You can keep a broad knowledge base sufficient to be useful in early directional discussions, and sufficient to sniff out a lot of (not all) BS artists and identify which specialists to listen to/loop in, in less time than it would take a specialist to do the same across teams, due to the nature of the generalist skill.

I think otherwise trying to be a "senior generalist" or "staff generalist" is going to be very hard, but for logical reasons.


It's certainly a factor, but I'd argue that by a very long margin, people management skills are the best indicator for engineers suited to making the managerial move.

You have to know what you're talking about and know when to defer to experts, and being a generalist might prepare you well for that. But once you've reached the threshold on knowledge and critical thinking, the biggest barrier to being a good manager is mastering the mechanics of people wrangling.


Being a tech generalist is a great background for a manager. But there are some caveats.

Firstly, besides understanding the work, a manager also has to be able to actually manage subordinates. This is orthogonal to technical ability. Secondly, internal promotion of a generalist carries risk of that generalist being too opinionated based on his own work. They are much more likely to be a conservative force. This might be an unnecessary barrier to good changes.


Great comment. I found these problems in myself as I transitioned into management years ago. I was blocking good changes...

These days I try to just be a tie-breaker for the team, instead of laying out the plan and looking for "feedback".


Disclaimer: I would categorize myself as a generalist so I might be biased here

> Unfortunately, this change often means generalists lose jobs in favor of specialists as a company grows

> And this is not inherently a bad thing for the company

Yes, it is not _inherently_ a bad thing, but I would say that it is _generally_ a bad thing. Sure, if you are raising a Series C and everything is about optimizing the hell out of your product, there might not be a place for generalists.

However, I would claim that if you throw out generalists, you throw out some of the core/most valuable people who could help the company establish new products and help you grow into a multi-product company (diversifying your portfolio is generally a good thing). If the company is only left with specialists it will slowly transform into another one of those industry giants that struggle with innovation.


I am not sure if it is as bad as described here. Yes, over time the relative share of generalists in a company shrinks. But it does not necessarily mean that generalists lose their jobs just because some specialist knocked at the door.

I think it is more like an organic transition where newly hired personnel are more likely to be specialists. And that is totally okay, because when you start a company there are about 0% specialists onboard ;-)


Late reply, but I think there's a second harm to throwing out generalists in addition to what you outline:

Specialists aren't as good as generalists at integrating the work of specialists across domains (almost by definition). A lack of generalists can then lead to a situation where specialists are all making locally good moves, but the overall direction is negative -- something akin to Simpson's paradox (though, in the other direction).


I think companies need _both_. Specialists can miss the forest for the trees, and at some point you need to go into the last 20 of the 80/20 rule, which requires supplementing or specializing your generalists.

And I believe it's entirely possible to be both; to be a better specialist you need to be a better generalist, and vice versa.


>often generalists have lots of historic operational knowledge and strong opinions

*citation needed

Edit: to be clear, in my experience the specialists that aren't also generalists on some level are the least productive on the team because they have trouble participating outside their knowledge domain. Maybe I have worked at the wrong places though...


Starting dev of some product, you might have a lot of generalists. They make the original architecture and design decisions, do the first implementations of features, etc. It's their baby. Later on, the company identifies laser-focus areas where specialists are appropriate, and you contract or hire some. The original devs (the generalists) will have knowledge about why something was built a certain way, and might react strongly when the specialist advises deep changes to the program to help it scale up.

That's the messy part of the transition: There's kind of the implication that past decisions weren't correct. It's easy for people to get emotionally involved and take personal offense.

Example: Backup product. Design started in the late 90s. By late-aughts, multiple CPUs were obviously a serious thing. New guy was brought in, and set on the project of re-architecting the core program into a more asynch data processing model. 6 or 8 months later, the changes were working for base cases, but the senior devs threw an absolute hissy fit focused on how much of their code was changed. We should've banded together and fixed the edge cases. Instead, we spent years fighting over the structure of the program, and that dev transferred to another group.

Bringing in security-focused employees worked out better. Their changes were more around the edges of the various programs, rather than in their cores. Fewer toes stepped on, fewer disagreements.


>"That's the messy part of the transition: There's kind of the implication that past decisions weren't correct. It's easy for people to get emotionally involved and take personal offense."

Thanks for highlighting something I missed in my post. Some of the time yes, the decisions might just be wrong because generalists may not be experienced enough to know the right way to do things. But quite often, they were in fact the right decision at that time and circumstances change.

So any sort of process and culture that helps take the emotion and risk of personal offense out of those situations is really critical to gaining the benefit of historical context in a positive manner.


To clarify, when a company is young, you have a lot of people wearing a lot of hats. They may in fact be quite good at wearing some of those hats. Likewise, having visibility and exposure to multiple domains and aspects of the same company gives a perspective that is hard to have if you live in a super specialized silo.

However specialists don't need to be generalists to be productive as long as they come in with the mindset that they are specialists, and therefore may not know everything. And that is especially important, as I highlighted, for respecting historical context of the company and its operations.

There might be VERY good reasons for not doing something the way the rest of the industry does it (which is what the specialist might be inclined to do). However a good specialist should also be adept at factoring that into how they approach problems, and properly weigh the new information against their past experience to determine if in fact this is a special snowflake situation, or if their approach is still the right one. Conflict can often arise if they push for the latter, so strong communication skills are key (and that is something that never really changes for any role at any stage of company growth).


I get where you are coming from. For me, being a long-term employee having gone through a few transitions from smallish to biggish, when the storage guy you hired doesn't really know networking basics, or the database administrator has no clue about Linux, etc., it tends to slow everyone down.

"Scale" often seems to equate with SF hotshots from one of the big names that will come in and airdrop a brand new architecture. These things take years and you need those generalists that have the tribal knowledge and the ability to jump in (almost) as deeply as a specialized developer, DBA, Network Admin, etc. I think the SRE movement is good evidence of the type of breadth and depth needed to operate a complicated system at scale.


IMO that's a problem of hiring the wrong specialists, or BS artists in specialist hats (see the article's point about "the tech industry is not serious about hiring" :) ).


In other words, we have to go to war with the army we have, not the army we'd like to have. Sometimes waiting for just the right fit isn't an option, sometimes because of the need to keep adding heads for growth, and sometimes because a Director (just brought in from BigDealCo to lead the growth) has a former employee they know is "perfect" for the job. The Director leaves after a year, after installing people who don't have the skills. Catch-22 sometimes.


True. I think it's easier said than done to find the cream of the crop with a limited talent pool for many specialized areas.


> Everyone in Silicon Valley uses equity, and not debt, to fuel their growth because debt is not available in sufficient quantities to poorly capitalized companies without a strong history of adequate cash flows to service debt.

A corollary to this is that if you're fundraising and can get debt, you should take it over equity almost every time. Many founders are skeptical of debt because everyone takes equity, yet everyone takes equity because it's all they can get.

At the very least, if you're seriously trying to decide between the two, remember that equity investors want at least a 10x return within about five years. Break out a calculator and look at what the interest rate on that would be (spoilers, it's higher than any lender would think to ask).
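
As a quick back-of-the-envelope sketch of that calculator exercise (using only the rule-of-thumb 10x / five-year figures from the comment above, not any real term sheet):

    multiple, years = 10, 5
    annual_rate = multiple ** (1 / years) - 1   # compound annual rate implied by a 10x return in 5 years
    print(f"{annual_rate:.1%}")                  # ~58.5% per year, far above what any lender would charge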


No no no. Debt is not universally better. Debt can wipe out companies. It requires payments that equity does not. Miss one payment and you're in default. It's extremely risky for companies that are young and volatile.

Debt vs. equity is always a case-by-case situation. Debt is cheaper but comes with strings attached. Equity is pricey but more flexible.


But it isn't in your creditor's interest that you go bankrupt either.


It is if they lose faith in your ability to make a profit. Better to force you into bankruptcy and auction the IP and tangible assets than let you ride it down to zero.


This. Being able to get debt should, in general, be a good indicator that your company might actually be valuable.


Not really. The easiest way to get a loan is to start a fast food franchise or build houses. Banks don't do innovation.


Bankruptcy can definitely be in the creditor's interest. It can take possession of the company's assets, and then sell them off to recover principal (in addition to many other outcomes - bankruptcy is complicated).

Debt incentives do not align with equity.


I don't think equity investments align in interest either - among founders, angels, VCs, PE firms and public shareholders.

There's enough material to back up the struggle between any of these entities.


I don't think debt would be worse for startups though. It's unlikely debt would wipe out a startup unless it continually finances through debt over a long period of time or just has really unfavorable debt terms (which wouldn't make sense for the debt issuer either).

The problem is getting debt financing...but I would imagine debt financing would make sense for 99.99% of startups.


> Miss one payment and you're in default.

I mean this is just factually wrong. Sure you'll rack up fees, it gets expensive quick, and there's generally no good reason to do it, but you're not in default until you're in default - and missing 1 payment is not going to do that.


I wonder if debt is "not available" simply because it's not the standard for SV. I would bet that companies like Slack, Stripe, Stitchfix, etc could have raised considerable amounts of debt instead of raising money through their last rounds.

It's essentially a self-fulfilling prophecy.


The general rule is don't start a company with your own money. I've heard this several times from people who have done it. When you incur debt, you have to put up collateral (e.g. your house). They won't just give it to a new company.

I assume the reason you don't start a company with your own money is that it can fail and personally bankrupt you in the process. Maybe the fear of that would hinder your judgement? Not sure. Maybe someone can fill me in on that.


> The chief products of the tech industry are (in B2C) developing new habits among consumers

This was a profound revelation I didn't realize until I read it just now. Facebook, Google, Amazon, etc. have all done so well by starting a habit for their users. ("Check Facebook daily to see what your friends are up to." "Google it whenever you need info." "Buy your stuff conveniently on Amazon when you don't want to go to the store.")

I think this approach can work great for even non-tech industries. It definitely worked for Starbucks ("Gotta get my morning Starbucks").

I think lots of industries are ripe for "habit-ifying", and I'm going to pursue that further. Brilliant insight.


> I think lots of industries are ripe for "habit-ifying", and I'm going to pursue that further. Brilliant insight.
This is, of course, the same insight that the gambling and tobacco industries and cults thrive on.

Not talking you down - just reminding you never to forget the moral angle.


Is it a moral issue if the habit starts a virtuous cycle? Like my Apple Watch making me exercise more?


It's pretty important to be constantly evaluating the ethical implications of what you're producing. I can't speak for larger companies, but I get the impression that ethics in software development is not nearly as emphasized as it should be. Even the tech guys/gals should be highly concerned with how their product affects their users' overall well-being (or the well-being of those around their users). In fact, they're a critical layer in protecting against unethical business practices given they're the ones actually implementing things and mediating the user's experience on a ground level. Small features in your product can have significant repercussions on the lives of many users given how these things scale. What could be a few lines of code to you could be something that millions of people end up being influenced by hundreds of times a day.


Yes, it can still be a moral issue (I'd prefer to use 'quality of life' rather than moral, as the term moral implies a certain level of judgement of the choices of others, and I'd rather be focused on making those judgements over my own choices).

Use of technology can both help and hinder us. On the negative side, we can become dependent on it, and that dependency does not always enhance our lives. Even in the case of your Apple Watch making you exercise more, used in moderation you can get the benefits without too many downsides, but it can also become a device that distracts you from the world around you, disconnecting you from experiences that you could get by being more fully in the moment. To be clear, this isn't an issue solely limited to use of technology, but the more devices we have and use the more easily distracted we become.

The ideal, from a company's perspective, is to have all of our life's needs and wants fulfilled by companies. If they can do so by suggesting there's a hole you need to fill by buying their products, and you become unhappy if you don't have it, is this a healthy trend? What if you were generally happy before this marketing campaign started?


Such "habits" are customary behaviours for getting a task done. The Innovator's Dilemma sees this as key feature of sustaining innovations, that might be technically revolutionary, but don't require any change in customer behaviour. It is an economic moat or enduring competitive advantage for them.

But when an innovation requires a change in customer behaviour, no such advantage applies, and the gates are open to new entrant. The book calls this disruptive innovation (and is the source of that term).

The companies listed all become successful due to such innovations.


It's not actually insightful. Every product trivially creates a habit amongst its users of "when <problem>, use <product>."


The insightful part for me is that you can get better or novel products in areas where the "when <problem>, use <product>" is the dominant habit by optimizing or changing the habit.

Some more examples:

- Keurig coffee makers fundamentally changed the coffee habit for many people.

- Netflix fundamentally changed the "going to the video store" habit.

Making coffee and watching movies were already habits for many people, but it was possible to change the habit and create something revolutionary. I'm sure there are other new untapped habits that can be developed around existing solved problems.

It's a variant on "solve a problem people have". Make a solved problem better by optimizing the habit.


Is there any product that doesn't fit this pattern?


There are some products that serve a need that doesn't occur often enough to become habit forming. For instance, there are apps/services that help you when your car breaks down. Marketing a product/service like that is a fundamentally different proposition from something that's intended to be used daily/weekly/monthly.


I'm not sure. That's my whole point. If you can find a product that doesn't fit the pattern you can be successful by adapting it to fit the pattern. Alternately, if a product you develop doesn't fit the pattern it will probably not be successful.

(Good questions, by the way! I don't want to accept business baloney as wisdom.)


many of the failed ones


I've just finished reading "Hooked: How to Build Habit-Forming Products", a very good book that explores this subject in a lot of detail.

If you're interested in learning more about this stuff, I highly recommend reading it.


Slightly off topic, but I didn't realize that this was a series of tweets (a thread) that was extracted and re-presented as a coherent essay.

This explains the somewhat "weird" cadence.

On topic: it makes a lot of sense.


Oh, that explains why it's so weird. I thought it was some kind of extra-terse blog until I saw your comment.


This should be titled "Condensed snippets of accumulated wisdom, focusing mostly on software and business, that I am fairly confident of and feel compelled to share with you".

Many of these nuggets are not "obvious", for any conventional understanding of that word.

That said, I enjoyed the post, and liked the concise format.


"obvious" is not a property of a statement itself, it indicates a relationship between a piece of knowledge and a larger knowledge base, where one thing immediately follows from other things that are already known.

I don't think "obvious" in the conventional sense is a real category, because no two people would agree the same things are obvious.


Ah, just like the good old "it is trivial to see that a -> b" from advanced mathematics.


The way I took "obvious" is that these things are obvious to people who have been dealing with such things for years to the point that they don't realize they aren't obvious to those that haven't.

I've noticed that this is a common thing - it's very rare for someone who has spent years in a particular area to sit down and distill things that have become obvious to them over time in a way that can be taught to newcomers.


This was mentioned by someone else, but in case you missed it: the "concise format" is a result of this being a 23-tweet tweetstorm that ThreadReader has compiled into one page for us.


The format is really good. Tired of paying ~$5 for a 300 page self-help book, when a 2 page summary like this can do just as well.


you should consider buying The Great American Bathroom Book. Despite the kitschy title, it consists of rather good and concise summaries of novels as well as nonfiction books, including some popular self-help books. None are more than two pages.


They should hire you to come up with titles.


This one had me stop

>The explosive growth of the tech sector keeps average age down and depresses average wages. Compared to industries which existed in materially the same form in 1970, we have a stupidly compressed experience spectrum: 5+ years rounds to "senior." This is not a joke.

I agree that it compresses the experience spectrum, but has the opposite effect on wages. It's tough to find any industry that pays more for the amount of experience/education.


Tech does not pay for experience.

A new grad's compensation is often equal to, and sometimes greater than, that of someone who's been in the trenches for 20 years. I don't know any other profession with this absurd inversion. And with a few notable exceptions, the idea that someone with 5 years of experience is Senior-anything is ridiculous. I remember myself at 5 years out; I didn't know shit compared to today. Yet, adjusted for location, cost of living, and inflation, I'm making today about what I made back then. The compensation plateau is very real, I assure you. You'll probably find out in 20 years or so.

EDIT: There also wouldn't be so much persistent discussion about ageism in tech either, if experience was valued/compensated.


I've definitely been part of a hiring process where "over-experienced" gets thrown around a lot. Here's stuff that comes up all the time:

- The main and probably unfair assumption that they will want more money and special treatment for the same amount of grunt work.

- They will not respect their (majority 20-something) peers and will try to subtly enforce seniority (again, unfair).

- They will refuse to do grunt work and only want to work on high level problems. We have enough chefs, we need line cooks.

- Veterans can come well-seasoned, but they can also be old cranks who are stuck in their ways and can drag a team backwards. It takes experience to tell the difference (irony).

- If they were truly experienced, they would be getting poached or going through a headhunter. If they are experienced and on the market, they must be damaged goods. (This one is truly unfair, but it justifies people paying the premium for a recruiter.)


> If they were truly experienced, they would be getting poached or going through a headhunter.

Huh, is that very common?

As someone with 10 years of experience but very little job-hopping (perhaps to my detriment), I always had this feeling that recruiters were a bit like dodgy car salesmen, and that a direct application would be better.


There are advantages to the recruiter too (for instance, it's easier to have frank conversations about questions like "how much will they pay?" very early in the process) and some recruiters are better than others. Of course your incentives aren't quite aligned and you need to be aware of it, but it's not necessarily a bad move to try using one.


Good recruiters are excellent networkers who are in touch with the most exciting businesses in your area and field of expertise. I've only met one but if I ever find myself looking for a job then I'll be talking with him.


What I meant is whether there is really that much of a bias against experienced direct applicants.


> Tech does not pay for experience.

Nor should it. You should be paid for the type of work you do, and there is only so much an individual contributor can do. However, at big companies there are many levels, and compensation ranges by a factor of 10 from the lowest level to the highest. You can get those big pay bumps, but you have to perform at the skill level required for the next step up, for which there are only so many available positions.

And from what I have experienced, while individuals can stall in their career progression if they don't pursue the responsibilities required for the next level, the higher levels are strongly correlated with experience.


My impression is that coding requires more energy and enthusiasm (in other words, internal motivation) to do a good job than many other professions. Hence a 5-year-exp "senior" can do a better job than a jaded and complacent 15-year-exp one.


I don't know about that. Surely expertise can make up for "energy and enthusiasm" (even after decades nobody knows everything about programming). It sounds a little like a post hoc justification of age discrimination.


In my impression, people who have been coding for 5 years already have plenty of experience, esp. if they are enthusiastic about learning. They may not be team lead material yet, but they should make a fine individual contributor.


I don't mean to deny this, but to say that they know all there is to know and will never get better... I don't think so.


I think he's arguing that the average wage in our industry is low simply because the workforce leans so young, so a disproportionate number of software engineers are still junior—and making junior wages—compared to other industries where the age curve is more spread out.


> the average wage in our industry is low

Is this a joke?


There are a lot of literalists in this thread.

The average wage is dragged down because the age distribution leans so far to the left. That is, the average wage disguises the wages made by staff/principal engineers, which can be 3-5x the comp numbers commonly tossed around for SV engineers.
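
A toy illustration of that arithmetic (the headcounts and comp numbers below are made up purely to show the effect, not real survey data):

    # Hypothetical cohorts: (headcount, average total comp)
    cohorts = {"junior (0-5 yrs)": (700, 120_000),
               "mid (5-15 yrs)":   (200, 180_000),
               "staff/principal":  (100, 400_000)}
    headcount = sum(n for n, _ in cohorts.values())
    average = sum(n * comp for n, comp in cohorts.values()) / headcount
    print(round(average))   # 160000 -- the average sits far below the senior cohort
                            # because the distribution leans so heavily junior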


They aren't low compared to wages in other industries, but I've heard it repeated over and over that they're low compared to the value that software devs create.


It's probably true. There are more CS majors now than ever in the history of the US.


Ah. That makes more sense.


> It's tough to find any industry that pays more for the amount of experience/education.

And in five years you'll be making about as much as you're ever going to make. I believe that was at least part of the point.


As someone with 25+ years in the same area of (non-web, non AmaGoogBook) engineering, this is totally the case.

The wage curve goes flat, and nobody offers anything above the median no matter how many years of experience and battle scars you accumulate.


How much has your work differed over the years? FANG companies have many levels because they work at such a large scale that they require people who lead other developers at all different levels (the orgs with the super-high-level engineers are also massive, with hundreds of engineers). They also grow so fast that their teams are constantly expanding, so they can support the additional promotions.


That's why you need to start a company ;-) It's also why there are so many startups.


Is this really a bad thing compared to all of those fields that have people working in them making less than software developers even after years of experience?

Someone five years into a developer role at Google is probably making much more than someone with fifteen years of experience in say, retail or nursing.


Don't forget hours and environment. I made six figures out of college for a relaxed 40-hour work week in an air-conditioned building, and I get to wear sandals to work every day.

There is a reason software developer is consistently ranked as one of the best possible jobs.


"Your idea is not valuable, at all."

If execution is all that matters, your idea isn't good enough.

I don't care how many times people keep claiming that execution is all that matters; it's still wrong, or it becomes tautological, like saying "the key to success is being successful."

Good ideas are often shitty ideas until they are good. What makes them good isn't just execution but timing, luck and so many other factors.

Failing to understand that is a bigger problem than the few people left who think the idea is everything.


I always understood it to mean that a good idea doesn’t sell unless you execute well on it, and also that an idea doesn’t need to be unique, it just needs to be executed better than the competitors.


But that is also not always the case. If you are first to market you don't need to be executing better than the competitors since they don't necessarily exist.

It's way more complex and subtle than the "execution is all that matters" or the "my idea is secured in a password protected powerpoint" camp.

I have helped over 100 startups, and at least in my experience, it's not more correct to say execution is all that matters than to say ideas are all that matter.

Plenty of well-executed ideas don't ever go anywhere because the ideas were wrong to begin with.

In the context of this otherwise fine list by Patrick, this one is particularly problematic because it's wrong in a non-obvious way but sounds right.


> Plenty of well-executed ideas don't ever go anywhere because the ideas were wrong to begin with.

Or, to put it another way, the founders made an MVP from their initial (for argument's sake, poor) idea, and then couldn't find product-market fit before running out of (bootstrapped or VC-funded) runway. Your argument, then, is that some ideas and MVPs make it impossible to find product-market fit, no matter who the founders / executive team are, because there is no market or whatever other reason beyond the startup's control.

And you would be correct if it wasn't possible for startups to pivot. But they do, sometimes multiple times, and sometimes successfully. Part of the reason why execution is everything is because good executives can take what the startup has learned about its product and its market, understand that the startup's current trajectory leads to bankruptcy, and make the necessary pivot which presents better odds at viability.


Startups pivot because their original idea didn't go anywhere. Plenty of startups pivot and never get anywhere.

Again, this obsession with casting execution as somehow being in opposition to the idea is what's wrong here.

Both are important.


>Your idea is not valuable, at all. All value is in the execution. You think you are an exception; you are not. You should not insist on an NDA to talk about it; nobody serious will engage in contract review over an idea, and this will mark you as clueless.

John Carmack was unusually direct and asserted pretty much the same thing a couple of weeks back. [1]

>This will never, ever work. Nobody will build your game idea and give you money for it. If you go through the development process, you will find that the original idea is a very small fraction of the value.

[1] https://twitter.com/ID_AA_Carmack/status/931653803199275008

Edit: typo s/in/is


Ideas are a multiplier of execution — https://sivers.org/multiply


I agree and disagree with this one. Of course, no one is going to pay money for your game idea, but there are ideas which require an NDA. Those ideas typically take years of development and a person with such an idea would be able to present hundreds of pages of alternate designs and their pros/cons. For example, if I have a design for a 3mm x 1mm mechanism that can actuate a rod by 2mm and I know what kind of power consumption said mechanism will have, that design might be expressible on a single piece of paper, and be worth millions of dollars.


>a person with such an idea would be able to present hundreds of pages of alternate designs

Arguably, that's much more than just an idea - it's the idea plus the development of the idea, with most of the value being in the development bit.


True, but if I know that you have been working on the idea for 5 years, and that you have those hundreds of pages of alternate designs, then even just the end result may be valuable enough to me that you would want an NDA.


I think the quote is talking about the original idea before work has been put into it, not an end product


But a sketch isn't an end product. It is not a "realization".


>Companies find it incredibly hard to reliably staff positions with hard-working generalists who operate autonomously and have high risk tolerances. This is not the modal employee, including at places which are justifiably proud of the skill/diligence/etc of their employees.

I'm not sure if he's saying these hard-working generalists are hard to find, or undesirable to employ?


Both. If they're hard to find, then it doesn't make sense to model your organisation in such a way that it relies on such people. Which in turn means that if you find one, you're not going to hire them.

As a corollary: I'm a developer who likes talking to and working with customers, doing UX research, etc. However, since the idea (which may or may not be correct) is that most programmers would rather work by themselves or at least have a project manager or something shield them from the customer, I'm unlikely to be able to find a job in which I can combine programming with customer interaction, even though I'm convinced that this could be valuable. But: only if an organisation is amenable to it.


Could someone elaborate what specifically is meant with having high risk tolerances?


Most employees look to own parts of larger projects where the returns are modest and the potential failure rate is low.

Most managers look to own projects that may return a modest multiple of their budget with an 80% chance of success.

There are few executives who will target an opportunity that is 100x their budget with a 20% chance of success.

To the company, the third group is much more valuable than the first and second, because the returns on the winners vastly exceed the cost of the losers. Individuals, however, are cognitively biased against personal failure, so it is hard to find people who will swing for the fences (and who can also convince people in groups 1 and 2 to work for them).

Good companies create frameworks where it is okay to fail (i.e. doesn’t impact your job, advancement prospects, etc). Unfortunately that is difficult and it generally is easier to hire sociopaths (no joke).
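
A toy expected-value calculation makes the point concrete (the probabilities and multiples here are illustrative only, not taken from the parent):

    projects = {
        "own part of a larger project": (0.95, 1.2),    # (chance of success, return multiple on budget)
        "modest managerial bet":        (0.80, 3.0),
        "100x moonshot":                (0.20, 100.0),
    }
    for name, (p, multiple) in projects.items():
        print(f"{name}: expected return {p * multiple:.1f}x")
    # The moonshot's expected return (~20x) dwarfs the others even though it usually fails,
    # which is why the rare employee willing to own it is so valuable.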


I wouldn't dream of speaking for Patrick but I believe it's something along the lines of:

unreasonable deadlines, non-existent documentation for existing systems, non-existent QA processes, unavailable/non-existent support staff, undocumented/unclear requirements, etc.


It sounds like a person who tolerates bad management well, won't try to change things so that they are better managed, and is simultaneously autonomous enough to manage their own responsibilities well. I see how the last one is a bit in contradiction with the first two. Such people exist, but are indeed rare and hard to recognize.


Yeah, if you're an employee you might care about things like that but some people do consulting, you know.

I would add: some organizations make fistfuls of cash and don't have much in the way of a technology team. Others pump billions of dollars through Excel macros. Not every company has a great IT team following all the latest buzzwords.


I'm also confused. What's a "modal employee"?


I assumed that was in reference to the mode of a distribution - the most common type of employee?


I concur with jpm_sd, he's referring to the most common employee (the "mode" of the set).


I believe it's a typo, should be "model" employee.


No, it's referring to the statistical mode, the most frequent occurrence in a set, "typical" would be a reasonable synonym there.

It's a roundabout way of saying those qualities rarely all come together.
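
For anyone unfamiliar with the term, it's the same "mode" as in basic statistics; a trivial sketch with a made-up sample:

    from statistics import mode
    employees = ["specialist", "specialist", "generalist", "specialist", "manager"]
    print(mode(employees))   # "specialist" -- the modal (most common) employee in this toy sample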


If you are attempting to hire for an engineering position, greater than 50% of people who apply for the job and whose resume you select as "facially plausible" will be entirely unable to program, at all. The search term for learning more about this is FizzBuzz.

So true, and something that stumped me when I sat at the other side of a hiring table myself. I couldn't figure out why until it was pointed out to me that people who are good at their job are not looking for work, they already have a job. In fact, the really good people often don't apply for jobs at all, but instead they get convinced to switch employers. By contrast, the people who can't program are on the job market all the time. So, the job market is heavily skewed towards people not good enough to hire.
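
For reference, the FizzBuzz screening exercise mentioned in the quote is roughly this (any reasonable variant counts):

    for i in range(1, 101):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)

The surprising claim isn't that this is hard; it's how many facially plausible applicants can't produce it.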


>By contrast, the people who can't program are on the job market all the time. So, the job market is heavily skewed towards people not good enough to hire.

People should be allowed to leave the industry for some amount of time without being marked as damaged goods.

In fact, based on how poorly technical interviews have evolved, I have no faith that a hiring manager wouldn't just blindly ignore all candidates with an employment gap on their record after reading posts similar to yours, and that really is a damn shame. It's not a humane practice.


I didn’t mean to imply employment gaps, just that people who can’t program do a lot more job interviews than people who can.


> Charge more. Charge more still. Go on.

This might sound stupid, and maybe it's my inner hippie talking, but did this make anyone else slightly uncomfortable? I get that it's "good business", but something feels kinda gross about saying "extract as much money from other people as you possibly can." Like... shouldn't you just charge whatever you think something (a product, your time) is worth? I realize that "what something is worth" is often subjective, but charging more just because you can feels morally ambiguous to me at best.

(Then again, if you're talking about extracting money from a corporation, that feels less gross. But of course my inner hippie would say that.)


Patio11 talks about charging more a lot: http://www.kalzumeus.com/2012/09/21/ramit-sethi-and-patrick-... http://www.kalzumeus.com/2015/05/01/talking-about-money/ https://training.kalzumeus.com/newsletters/archive/consultin...

His argument is basically that businesses/consultants/employees fail because they charge/bill/ask for too little WAY more often than they fail for charging too much. Therefore, you should charge more than all of the decision-making biases in your head say you should.


I interpreted that point very much together with the preceding point:

> Technologists tend to severely underestimate the difficulty and expense of creating software, especially at companies which do not have fully staffed industry leading engineering teams ("because software is so easy there, amirite guys?")

That, in my experience, is very true. Teams are staffed to "we can make it work, maybe" not "we can actually engineer something"; the second bullet, to me, says, "you need to charge more than you think you do, because you're underestimating the costs of doing it, in particular, of doing it well."


The problem (other than your inner hippie :P) is that it can be quite difficult to accurately value a software product. Most physical goods are valued based on the cost of materials/manufacturing. However software is an intangible good without those fixed costs. The closest thing is hosting.

For B2B solutions, value is more tied to how much it would cost a customer to implement your solution instead of buying yours. But even a startup founder who's been an engineer for many years might not have a clear picture of how much the previous projects they worked on cost their employer. They only see their own hourly rate or salary (which already might be low) and don't see the total cost of development, the cost of infrastructure, personnel for ongoing support, etc. Engineers notoriously under-price startup offerings because they lack the broader perspective to accurately value their creations.


Mini aside, but products are never priced purely on the cost of materials; if you follow the chain, all materials start out free. They are all priced starting at what people are willing to pay for the labor to take/prepare/transport those free materials. Maybe that helps show that engineering is no different.


I get where you're coming from, but my point was mostly that when a business develops a product, upfront engineering is often seen as somewhat of a sunk cost. The more important measure is the margin on the physical product. The cost of raw materials (and shipping, manufacturing, etc) is far easier to quantify.


Taken with the rest of his comment, the meaning here is to ensure that you build a viable, sustainable business.

The norm for techies is to underprice their offerings (based on underestimating the difficulty and expense of creating, selling, and supporting software). This leads to even "successful" companies closing because they weren't built to be financially sustainable.

One way of making a company sustainable is to charge a sustainable price (which is typically "more" than initially apparent).


He's not writing "extract as much money from other people as you possibly can". He's making a subtler point, similar to Hofstadter's Law, in that you're not charging enough, even if you know that you aren't!

So, the point isn't to extract as much money from other people as possible. It's to charge more than you'd otherwise be comfortable charging because, even when you do, you may still be under-charging relative to the actual costs required and value provided.


Yes. I think it's related to the Pareto principle, in that the last 20% of a project takes 80% of the time.


I took it as something more like "people are probably willing to pay a lot more than you think they are." Like you said, pricing is subjective. Sometimes people are willing to pay more than you thought, and they still feel they're getting a good deal. Other times, they pay because they feel they have no other choice. Perhaps there's some wisdom in telling the difference.


> shouldn't you just charge whatever you think something (a product, your time) is worth?

How exactly do you, as a programmer without knowledge of the employer’s or client’s business, determine what a piece of software is worth to them?

You have no idea. So you use various heuristics which are more easily available but misleading, like “what are other people being paid?”


How much it's worth to whom? Presumably the customer? (If you're doing it based on how much it's worth to you, that's more like a hobby than a business.) How can you know how much it's worth to the customer? One very simple way is to increase the price until they no longer want to buy it.


Charging whatever YOU think it's worth is where your reasoning is flawed. The point of charging more is so that you find out what the CUSTOMER thinks your product is worth. However, I understand that the brevity of the sentence makes it seem like the author was asking that you hike the price day after day until your clients start refinancing their houses to pay you, but there are more tasteful ways to increase pricing which is what he was referring to.


> shouldn't you just charge whatever you think something (a product, your time) is worth?

If you’re paying yourself, then sure, go ahead. But if you want someone else to pay you, then it should be based on what they think it’s worth. Why should they care what you think it’s worth?


Assuming perfect information and rationality, you can't charge more than something is worth, at least in the long run, so there's no conflict.


Sometimes charging more is a signal that you're serious and will increase demand, contrary to elementary ideas about demand curves.


I don't believe companies want "hard-working generalists who operate autonomously and have high risk tolerances." I mean obviously, yes, they would claim to want that, but that sort of employee is fundamentally incompatible with modern business. Modern business is predicated upon the idea of employees being as replaceable and interchangeable as possible.

It is a common idea presented in many management books that if an employee becomes important to the company, they should be fired immediately. The thinking is that if your company grows in reliance upon this person, they might one day ask for more money, and when you fire them at that point the damage done will be worse, so it is better to nip the problem in the bud, bite the bullet, and get rid of them when you notice their importance and deal with the smaller amount of disruption.

Any hard-working generalist who operates autonomously will, by necessity, be building systems which they have all of the domain knowledge about. And despite companies wishing to believe domain knowledge is not valuable, it has substantial operational and financial impact when lost.

>The hardest problem in B2C is distribution.

This confuses me. Distribution is trivially solved. Anyone can get a product to a customer for quite low cost anywhere in the world now. Is something else meant by "distribution" aside from 'getting things from point A to point B'? If I had a warehouse in my back yard and a customer wanting a product from it, I could ship it to the customer just as easily as Amazon could. Their advantage isn't in being able to distribute things. It's in aggregation and many other different things but not distribution. Distribution was the most valuable economic activity of the past century, but it's commoditized now.

>Weak-form efficient markets hypothesis is a good heuristic for evaluating the public markets

The efficient market hypothesis says that profit is an error, an inefficiency which will rapidly go to zero as it will be stripped by underpriced competitors. This is not a good heuristic.


> The efficient market hypothesis says that profit is an error

No it doesn't. It's a pricing hypothesis. It says nothing about the presence or absence of profits.


> Charge more. Charge more still. Go on.

I find this advice unhelpful. In the past, when I was making $23/hour as a web developer, people would occasionally ask to hire me for freelance work. I've given various answers, from $30 to $100, and haven't had any takers yet. The only freelance work I've actually done so far was when I said, "what do you think it's worth?" and my friend said, "I dunno, 60?" It was for about an hour of work so I said yeah, but I had already done the work by that point anyway.

$100/hour seems like the right amount to me, since my company was billing me out at $116. But no one else thinks so. I'm pretty sure there's more to it than just "charge more".


I think this piece of advice is extremely relevant for those selling B2B software/products which comes in bigger 'chunks' than your labor which can be sold by the hour.

The gist is that while you (as the average B2B startup) can estimate how valuable your product is to your average customer, you will most likely vastly underestimate the cost of replacement for such a system. In other words, if your sales pitch goes well and they like the product but think to themselves "How much would it cost us to build this ourselves?" you will underestimate this cost from the outside. This is because the cost of replacement is a reasonably good proxy for a price ceiling and by being aware of it you can get a much higher average selling price.


Then you are not competitive enough or not marketing to the right people.

You might think that there is a market for $15/h skilled developers, but there probably isn't. There is a huge market, however, for NewStartup CEOs looking for devs at $15/h.

They'll keep running in circles until they find something on their budget, or they don't. But they'll not raise prices. Beware of those.


Your company markets, has sales people, and manages every part of the transaction from signing the contract to developing the code. I don't know if all that is worth the 77/hour difference, but it might be.


I like this format/premise a lot... it allows you the freedom to say things you find insightful without the risk of some know-it-all coming along and telling you it's obvious. We should do more of this -- nearly everything is going to seem obvious to someone else, but those people should just keep their mouth closed and keep looking for something they didn't know.


>The amount of money flowing through capitalism would astound you

I really like this one. During my biology days, there was a fellow PhD student in the lab who had spent 15+ years in corporate IT before joining academia, and he always had great stories of the incredible amounts of cash we should expect to see flow around us should we embark on careers outside of academia (he was not wrong).


Especially after reading the comments on this thread, I'm really not sure HN is a good venue for discussing tweet streams. I think if Patrick wanted to write something that would spark a discussion on HN, he'd have written a blog post.


Are you sure HN is a good venue for discussing anything?


No, that's a fair point.


Which venues would you recommend for good discussions? Honest question


I for one had no idea this was not a blog post.


That's definitely a part of the problem.


Can we change the link to this: https://twitter.com/patio11/status/936615043126370306

(so folks understand this is a tweetstorm and not a post?)


I actually think this interface is a better way to read a tweetstorm, it could just do with prefixing the title with "A tweetstorm by patio11" to remove the confusion.


> There is no hidden reserve of smart people who know what they're doing, anywhere. Not in government, not in science, not in tech, not at AppAmaGooBookSoft, nowhere. The world exists in the same glorious imperfection that it presents with.

This is painfully true. I so often hoped someone would explain some Linux stuff to me, but... strace it is :)


I think the main point of the quote is that nobody knows everything. There are pockets of knowledge, and there are specialists who know quite a bit about Linux. ;) You just need to find that person and not expect them to also know about something completely different like Windows or Asian-Pacific geopolitics.


> Most open source software is written by programmers who are full-time employed by companies which directly consume the software, at the explicit or implicit blessing of their employers. It is not charity work, any more than they charitably file taxes.

There are perhaps 1000 OSS projects that make up the software you use on a daily basis. Many of these have core developers or maintainers who are hired by a company to write or maintain that software for them to consume.

But there are literally millions more open source projects, almost entirely developed by hobbyists.

Github alone has 24 million developers, with 67 million repos (25 million of which are public), and 1.5 million organizations, in 200 countries. The top languages used there are JS, Python, Java, Ruby, PHP, C++, CSS, C#, Go, and C. (Github, btw, is not the only repository of open source code)

Most open source software is most definitely not written by one kind of dev. But there _is_ an underlying motivator behind almost all open source: "I need this code." As long as there is someone who needs it passionately enough to take it upon themselves to craft and produce it, without regard to commercial gain, it will stay alive. Open Source is basically woodworking, and our computers are the chisels and saws.


> Your idea is not valuable, at all. All value is in the execution.

This is wrong, and I'm surprised it's repeated so often. Some ideas are better than others. If you doubt this, ask yourself if, when starting a company, you would choose a random idea or your best idea.

Of course you would choose your best idea, which implies ideas have different value.

EDIT: I'm happy to debate this point with anyone who would otherwise downvote me. Critical replies welcome.


I don't think it necessarily means "all ideas are the same". You're right -- some ideas may be better than others, but until they're executed successfully, they're just ideas.

ie: There were multiple ways that Google could have executed their goal of building a search engine (engineering, sales, etc). Even if you come up with a 'better' idea, it may not be successful/valuable if you execute it poorly and make the wrong decisions.


Except google's idea wasn't "search engine". Google's idea was "search engine that uses the PageRank algorithm". PageRank did not encompass all of the value of Google, but it did encompass much of it, and it was unequivocally part of the idea.

What you're effectively saying is that ideas have no value without good execution, which is of course true, but this is very different than saying they have no value, because after all, execution has no value without a good idea.
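
For what it's worth, the "idea" half of that pairing fits on a napkin. Here is a rough power-iteration sketch of PageRank over a made-up three-page link graph (obviously not Google's actual implementation, just the textbook form of the algorithm):

    import numpy as np

    links = {0: [1, 2], 1: [2], 2: [0]}      # page -> pages it links to (toy graph)
    n, d = len(links), 0.85                   # d is the usual damping factor
    M = np.zeros((n, n))
    for src, outs in links.items():
        for dst in outs:
            M[dst, src] = 1 / len(outs)       # column-stochastic link matrix

    rank = np.full(n, 1 / n)
    for _ in range(50):                       # power iteration
        rank = (1 - d) / n + d * M @ rank
    print(rank)                               # page 2, with the most inbound weight, ranks highest

The execution half, which is crawling, indexing, serving, and monetizing this at planetary scale, is the part that doesn't fit on a napkin.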


If the value is in "search engine that uses the PageRank algorithm", then theoretically you can take that idea and execute it to get the same value.

The difference isn't in the idea but the execution - hence the value.


I'm sorry, I don't follow.

EDIT: Oh I see what you mean now. By "you", you actually mean me. No, I couldn't, because Google has parlayed their initial good idea into a monopoly based on network effects, specifically the accrual of proprietary data (clicks) which I have no access to.

The idea was valuable in 1997. It's no longer.


If the value is in the idea, then multiple executions of the same idea should result in the same/similar value. But... this isn't true IRL; therefore the difference in value isn't in the idea but in the execution.


Your error is in concluding that because execution matters, ideas don't matter. But that's erroneous. Both matter.


>when starting a company

i.e. "when executing"

But yes, other things being equal, the execution of a good idea is more valuable than the execution of a bad idea.


To recast what you are saying to remain congruent with the original premise, one might say ideas can have different potential value.


By that rephrasing, execution also only has potential value, since the value is only realized in conjunction with a good idea.

So if you want to keep any useful definition of "valuable", you have to concede that ideas can be valuable.


> The hardest problem in B2C is distribution. The hardest problem in B2B is sales. AppAmaGooBookSoft are AppAmaGooBookSoft primarily because they have mortal locks on distribution.

I agree with this, but I think it's interesting. Wasn't the Internet supposed to democratize distribution? All someone has to do is go to your website. And anybody can set up a website.

So it seems like the Internet just shifted the goal posts for everyone? The winners are still as dominant as ever.

But I guess it is true that you can have a break-out hit like Instagram or WhatsApp, and that wasn't possible pre-Internet. The time scales are compressed.

But perversely, maybe because distribution is easier and software flows more freely, you have to be even better? You can't rely on "local inefficiencies"? You have to be globally the best.


> AppAmaGooBookSoft are AppAmaGooBookSoft primarily because they have mortal locks on distribution.

Well, first of all, this isn't true. Google and FB aren't so profitable because of a distribution monopoly. They're profitable due to network effects.


Yes I agree the original post oversimplified it.


network effects: a phenomenon whereby a product or service gains additional value as more people use it.

How does google search get better from more people using it?


To be brief, more data (generated by clicks and queries) leads to better ranking. Better ranking attracts more users. Starting a search engine from scratch without any user data is a big disadvantage.

Basically, you're benefitting from other people who are searching for the same thing as you on the same search engine. (I'm sure that's not the only network effect, but it's one of them.)
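
A minimal sketch of that feedback loop, with made-up data and an oversimplified scoring rule, just to show the shape of the effect:

    from collections import Counter

    # Aggregated click log: (query, url) pairs from all users of the engine
    clicks = Counter([("python sort list", "docs.python.org"),
                      ("python sort list", "docs.python.org"),
                      ("python sort list", "random-blog.example")])

    def rerank(query, candidate_urls):
        # Boost results that previous users clicked for the same query
        return sorted(candidate_urls, key=lambda url: clicks[(query, url)], reverse=True)

    print(rerank("python sort list", ["random-blog.example", "docs.python.org"]))
    # More users -> more (query, url) data -> better ordering: that's the network effect.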


I think it's that the goal posts have shifted, i.e. distribution used to be the bottleneck for a lot of people, but now that it's easy (for internet businesses) there's a new bottleneck.

There's so much stuff on the internet that it's a problem of discovery, which is why Google became so dominant.


I expected this to be an article about general life tips. Perhaps the title could use the suffix “About Startups”?


I think it's about software / programming world and economics, not just startups. Still very niche, not stuff about life relevant to any non-tech person


"Your idea is not valuable" ... no, maybe not some top level idea like "Social media, but more private." But, execution is just a bunch of smaller ideas.


It's not just about ideas. Coming up with ideas is easy. Turning an idea into a viable product involves a lot of work.


> Coming up with ideas is easy.

I think that depends on how stringent the criteria are that you put on the ideas. An idea that maybe, with a lot of work, would be successful might be easy to come up with. An idea that is very likely to succeed, with little work and no investment, might not be that easy to find.


> Your idea is not valuable, at all. All value is in the execution.

Then why did Amazon patent the concept of a drone that will fly to your car and recharge it?


Because patents are valuable. Specifically, blocking others from executing.


Patents first require an idea to be patented, so if patents are valuable, ideas are valuable.


And because they know patents are valuable, Amazon will freely file for far more than they need, and employees will be rewarded for coming up with them even if they are dumb. This specific patent is likely the latter, in spades.


Ideas are not valuable. A stable of patents backed up with a team of IP lawyers is.


I'm also somewhat sceptical of the blanket "ideas are never valuable" statement.

Surely there are situations where not many people are going to think of the idea and being first to move on it is very important

...such that both the idea and the execution are important. You can't execute well on something that you weren't able to come up with.


Sure you can. Henry Ford didn't come up with the idea of the car, nor did he invent manufacturing by building many copies of the same identical thing in a factory.

But he sure did execute doing both together well.


To prevent others from executing it better than them.


Without the idea, there would be nothing to execute on.


Why would I want Amazon charging their drones from my car?


Because one day that idea could turn into a product that is valuable.


I agree with most points except this one:

>> Most open source software is written by programmers who are full-time employed by companies which directly consume the software, at the explicit or implicit blessing of their employers. It is not charity work, any more than they charitably file taxes.

Most open source projects don't go anywhere... They have about 3 stars on Github and nobody uses them.

To say that open source is not charity work is quite unfair. The number of people who make money from their open source projects is tiny... Even among popular projects with thousands of stars. The real motivation behind most open source projects is a strong desire to learn and explore new ideas and to share these ideas with others.

The financial stuff might come later but most of the time it doesn't come at all.

Outside of the cryptocurrency space and big corporations, very few people who get into open source do so because they think it is a good way to make money... Because it's not.

I believe very strongly that most open source projects are founded on altruism. The motivation might change over time but you cannot disregard the initial intent.


You're making the mistake of assuming open source means "something you do that's not your job."

This is false.

Most open source development is paid work. Many people "get into open source" by accepting a regular day job.

https://www.techrepublic.com/article/for-50-percent-of-devel...


I've done open source work for 10 years. I'm pretty sure I know what open source means.

You can have a look at GitHub; there are 25 million active repos on there; at least 99% of them are not sponsored by any company and nobody gets paid to work on them.

People like the OP and yourself are spreading distorted ideas which harm open source and the software ecosystem as a whole.

Putting open source projects on the same level as commercial software is not right. They are often completely different kinds of people behind them.


You're probably mostly right, but note that there is a bit of a "dark matter" issue here. A lot of open source work is done at the behest of an employer and on the clock but still appears to be a "personal" project.

I have worked on many open source projects that would appear to be entirely the work of individuals, when they are in fact working on company time.


Open source projects are founded on self-interest, not altruism: "I can do this better than any existing solution"! On rare occasions this is true, and on even rarer occasions others agree and start contributing.


BTW, doesn't this app break Twitter's ToS by displaying tweets in a non-standard format? https://developer.twitter.com/en/developer-terms/display-req...


> Meta thought: you radically underestimate both a) how much you know that other people do not and b) the instrumental benefits to you of publishing it.

Can someone explain b) to me?

Is he saying that, assuming you fit into category a, by explicitly revealing your knowledge, your potential opportunities will expand more than you estimate?


Yes.

For instance, if you consistently put out good pieces on your past experiences in the software industry as a PM, then your current experiences as a software business owner, you will gradually become widely known as the guy who gives advice about what you shouldn't be doing in your software career or software business.

This was essentially how Joel Spolsky grew his audience through his blog -- turning his knowledge into published pieces. His blog audience combined with Jeffrey Atwood's blog audience were instrumental to getting StackOverflow off the ground.

You never know, but having an existing audience can be an incredible enabler for getting your future ventures off the ground.


TLDR: Yes. You will benefit from sharing knowledge.

> You radically underestimate (...) the instrumental benefits to you of publishing it (much you know that other people do not).

(He is saying, at least, that) you will benefit (more than you estimate) from publishing things you know (that others don't).


I do not doubt this:

> Most open source software is written by programmers who are full-time employed by companies which directly consume the software, at the explicit or implicit blessing of their employers. It is not charity work, any more than they charitably file taxes.

But I wonder how many important Open Source projects were started that way, by employees of companies which wanted to use those projects. Linux, for example, most certainly was not begun that way.


Yes, I think this sentence oversimplifies things. The conclusions should be divided between:

1) Most open source software written and released today

2) Most open source software you use

The code in category #2 is much older on average.

Most people here use bash every day, but as far as I can tell, there's about one person behind it, who doesn't get paid for it (Chet Ramey).

And you're right that a lot of projects start free, and then get commercial contribution once they are valuable. Linux, git, hg, and Python all fit that profile.

In the "non-company" category, I would put Linux, git, hg, gcc, Apache. A lot of people still use Apache. nginx was in this category too until very recently.

SVN is interesting because it was started by a company (CollabNet), but I believe they diverged and the company no longer had much economic interest in it.

Anyway, I guess what I'm saying is that the story is really complicated. It's not as simple as what's in that Tweet.


> And you're right that a lot of projects start free, and then get commercial contribution once they are valuable. Linux, git, hg, and Python all fit that profile.

I think you used the wrong word here, or at least a misleading one:

Linux, for example, is still free and Free for most reasonable definitions people have of those terms. (Anti-copyleft trolling is not reasonable.) Its development is funded, yes, but that funding doesn't give IBM, for example, special rights which a sufficiently determined individual hacker couldn't also have: Both have the ability to have an idea, code it, and get the code accepted, and IBM isn't immune to a Linus veto just because IBM's poured money into their team of kernel contributors.

I rant, but it's an important distinction between Open Source and closed source. It also reminds me of an old joke: Back in the 1990s, Apple, bereft of Jobs and nearly dead against Microsoft, was moving towards IBM in terms of strategic partnerships. So, the joke went "Apple + IBM = IBM", as in, if you move too close to IBM, IBM engulfs you and you lose your identity. My point is, IBM can't engulf an Open Source project unless absolutely nobody outside IBM cares about it and all the repos outside of IBM's purview utterly stagnate.

> In the "non-company" category, I would put Linux, git, hg, gcc, Apache. A lot of people still use Apache. nginx was in this category too until very recently.

I don't know if you noticed, but you put Linux in two different categories.

> SVN is interesting because it was started by a company (CollabNet), but I believe they diverged and the company no longer had much economic interest in it.

That is interesting.

> It's not as simple as what's in that Tweet.

Things rarely are. :)

Good post!


Yes, there are a lot of dimensions, hence Linux falling in multiple categories. It was started by an individual, but many of the current contributions are by people who are paid to work on it.

The story isn't simple for any big/successful project I can think of:

1) Started by a company for profit, vs. an individual for fun, or maybe profit.

2) Where the current commits come from, regardless of how the project started (contributor is paid or not paid)

I didn't mean to suggest that "free" and "commercial contribution" are mutually exclusive. The distinction I meant is whether the contributor is part of an organization that makes money from the software.

3) Whether the project was forked. WebKit was forked, LLVM was shepherded by Apple into a huge project, but Apple didn't start it. CyanogenMod was forked the other way (from a company to a commercial effort), and then turned back into a different company.

Additionally, some people might start a company to make open source software. And some employees might be paid more making proprietary software, but choose to work somewhere where they can work on open source.

So yes it's very complicated. This quote:

It is not charity work, any more than they charitably file taxes.

isn't useful, except for the very small number of people who think that "open source == free == no profit".


Not sure I agree with these two:

"Salaries in the tech industry are up a lot in the last few years, caused by: a tight labor market, collapse of a cartel organized against the interests of workers, increasing returns to scale at AppAmaGooBookSoft, and the like.

Investor money does not pay most salaries."

But the rest are pretty sage. As one would expect of patio11 :)


"collapse of a cartel organized against the interests of workers" -- to what is this referring?


Agreements not to negotiate past a certain amount and not compete too hard for talent, etc. amongst tech companies to reduce wages. Link:

https://pando.com/2014/03/22/revealed-apple-and-googles-wage...



> Technical literacy in the broader population can be approximated with the Thanksgiving test: what sort of questions do you get at Thanksgiving? That's the ambient level of literacy.

I didn't understand this one (perhaps due to not being American). Can someone explain it please?


Thanksgiving is a holiday where many people have a big family meal. If you have family members who you usually only see once a year, there's a good chance that day is Thanksgiving. So that's when a lot of people tend to field questions from people who 1) care about them enough to ask how their new job is going but 2) have no idea what they really do at work.


I thought they meant that they would ask you to fix their computer.


"""in B2B) taking a business process which exists in many places and markedly decreasing the total cost of people required to implement it."""

thus a big idea search should be on what are common processes. "reading english" seems to be the next big one


Your idea is not valuable, at all. All value is in the execution. You think you are an exception; you are not. You should not insist on an NDA to talk about it; nobody serious will engage in contract review over an idea, and this will mark you as clueless.

Love that so much.


Whatever this app is, it would be really helpful to not do the jagged edge thing on the top as it makes it completely unreadable for me. I use the top of windows to line up text and without that ability, I basically cannot read the words.


> This is not the modal employee,

Is this a misuse of the word "modal"? If not, could someone please explain this definition to me? From context, I'm guessing the author means "typical" perhaps?


He does mean typical, but I'm not sure that's exactly the right definition. The mode of a set is the value that occurs most often, so the modal employee is the most frequently occurring kind of employee.
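
For illustration only (a hypothetical sketch, not anything from the thread or the tweets), here's the mode-versus-mean distinction in Python, with made-up numbers:

    # "Modal" = the most frequently occurring value, not the arithmetic mean.
    from collections import Counter

    risk_tolerance_scores = [1, 1, 1, 2, 9]                    # made-up scores
    mean = sum(risk_tolerance_scores) / len(risk_tolerance_scores)     # 2.8, skewed by the outlier
    mode = Counter(risk_tolerance_scores).most_common(1)[0][0]         # 1, the most common case

    print(mean, mode)

So "not the modal employee" is a claim about the most common kind of employee, which can look quite different from the average on any given dimension.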


I thought this was going to be a list about "life". I was actually pleasantly surprised it was about our industry!

I would still like there to be a list like this about life though.


I'll have a go. Here's a few that seem almost axiomatic to me but that I think might be somewhat less so to this crowd:

- Words are important. Choose them carefully. Hear them carefully.

- "Tolerating everyone but the intolerant" is no real tolerance at all.

- Only being "free" to hold your views in private, is no real freedom at all. Freedoms of speech and association and religion must be public freedoms if they are to be meaningful in any real sense.

- There's a continuum that exists between equality and freedom. Governments and society can more-or-less choose where they sit on the continuum, but can't move more towards one value without trading off some of the other.

- Prices agreed to by individuals in a free market (read: without coercion) are a statement each individual is making about the value of the good or service being traded. What's a "fair wage" for mowing lawns? What's a "fair price" for a watermelon? How much is "too much" profit for a legal firm? Capitalism is nothing more than a recognition that built into humanity is a desire to trade what you have for what you want, and a declaration that the Free Market is the most economical (read: efficient) way for humans to fulfill that desire. Following on from the previous axiom, you can certainly declare a "fair price for watermelon", but you're removing the freedom of individuals to decide for themselves how they value the world around them.

- Be highly skeptical of anyone handing out pitchforks. And learn to recognize when you're being handed one in a news article, youtube video, reddit comment, political ad, etc., etc.

- When you feel like the world is full of people who aren't changing their values fast enough, remember Chesterton's Fence[0]. In the main, the world around you exists the way it does for a reason (or for a plethora of reasons), and if you don't understand that reason, you're less likely to understand what the right fix is.

- Debt (of all kinds: student, mortgage, business) is a promise you're making about your future. The older, more formal term for a loan is a "promissory note".

- Before you make any promise, consider your ability to predict and control the future.

- Constraints are natural pressures, disadvantages, and discomforts that can be helpful in decision-making. Don't avoid painful situations.

(I think those last three make a powerful argument for bootstrapping startups instead of chasing VC funding or debt.)

- We will always have "bad" laws. There is no one set of guidelines which humanity will ever discover that will cleanly cover all use cases and will be agreed upon as being right by everyone.

- Humans are moral creatures. Whenever anyone uses the word "should" or "ought" in a sentence, they're making a moral statement. Hawking may be a world-leading physicist, but when he declared philosophy to be dead, there's a reason philosophers all around the world shot milk out their noses[1].

- You can't legislate morality. But in a democracy, legislation is the lowest-common-denominator of the morality of the governed. This means that just because something is legal, doesn't mean it's moral. It also means that in a society which encourages diversity of values, you'll have less common ground upon which to legislate.

- Don't act surprised when someone is ignorant of a fact or subject which you know. Help them understand it with humility.

- Don't hesitate to disclose your ignorance of a subject to someone who can help you learn more about it. They may not want to point out your ignorance, and silently pretending you do understand it only keeps you from learning new things.

- Corporations are people. Or more precisely, it's impossible to separate a corporate entity's values and behavior from its owners' values and behavior.

- As much as possible, have your mind made up how you will handle pressure to compromise on your values before you're in a tenuous situation.

- Don't worry about "missing out". FOMO is terrible justification for doing things you wouldn't otherwise do. Though this isn't an argument against spontaneity.

[0]: https://en.wikipedia.org/wiki/G._K._Chesterton#Chesterton.27...

[1]: https://www.quora.com/How-are-philosophers-reacting-to-Steph...


> Technical literacy in the broader population can be approximated with the Thanksgiving test: what sort of questions do you get at Thanksgiving? That's the ambient level of literacy.

At my Thanksgiving table there are two professional programmers, a CS student, and a CS professor. While I'm doubtless an extreme example, my general point is that "your Thanksgiving table" is still a bubble. That said, sure, for many it will be closer to representative than friends/coworkers.


Anyone want to take a crack at expanding this?:

> The tech industry is fundamentally unserious about how it recruits, hires, and retains candidates.


Given the amount of money and resources that goes into recruiting and hiring, I find it hard to believe that the industry is not serious.

On the other hand, the current approaches clearly don't work very well, and everybody knows it, but nobody seems to be interested in seriously thinking about what we need to do differently. (patio11 and tptacek are notable exceptions)


Aline Lerner is interested: http://blog.alinelerner.com/


So her best predictor of future job performance is the number of typos on a resume. That is my pet peeve when looking at resumes, but it really discriminates against people for whom English is not their native language. On the other hand, I didn't have a lot of non-native English speakers applying to the positions for which I was hiring back then.


I like the notions of specialists and generalists. Generalists tend to under-estimate specialists and vice-versa.


Can anyone expound on "collapse of a cartel organized against the interests of workers"?



Generally good stuff, some quibbles:

> Your idea is not valuable, at all. All value is in the execution.

Good advice in the main but not 100% accurate. Some ideas are novel and good. In fact, many are. But communicating even good ideas is difficult bordering on impossible, more so for truly novel and innovative ideas. Think about how often an adaptation of an excellent novel translates into an excellent movie. It's rare. But that doesn't mean that novels (or ideas) are worthless. It means that execution is hard, and that communicating ideas is hard. Even if you spend years turning an excellent idea into a fully fleshed-out form (a novel), it will still be hard to execute it faithfully. And by the same token, most books are pretty mediocre, just as most ideas are mediocre. Trite, repetitive, hackneyed, etc. That doesn't mean all ideas are valueless, just most of them, and it can be difficult to figure that out until you've gotten to execution.

> There is no hidden reserve of smart people who know what they're doing, anywhere.

This is also mostly true but a bit misleading. It's nice to think that everyone is just struggling along half-assing things the same way you are, and that's true. Mostly. But there are actually a lot of people who really do have their shit together, really do know what they're doing, and are just better at their jobs by a significant margin than most people. It's comforting to think that such people are an illusion, but they are out there; sometimes they aren't even doing glamorous work. The fact that they exist isn't really terribly important (if you're really lucky you might be able to hire them, but you probably won't be that lucky).

What's more important, I think, is that there is no excuse not to try to do better, constantly. If you want a real competitive advantage, don't worry about hiring the top 1%, rockstars, ninjas, or even necessarily the "10x" workers. Concentrate on fostering a culture of continuous improvement and collaboration. Learn about best practices. Apply them. Build policies, processes, and culture that support doing better. Study what works and what doesn't; push towards an environment that fosters doing what works and abandons bad practices. Stop firefighting; start doing root-cause analysis. Do code reviews regularly, and get better at them. Test your code, increase automated testing, and increase the quality of testing. Keep moving the bar higher in terms of quality. Figure out how to accelerate development velocity. Move error detection closer to the point of checkin. Never give up raising the bar. Decent developers working as a team with a good process focused on continuous improvement will out-perform rockstars stomping on each other's toes, every day of the week, by orders of magnitude.
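
To make "move error detection closer to the point of checkin" concrete, here's one minimal sketch (my own illustration, not anything prescribed above) of a git pre-commit hook written in Python; the specific tools it runs, flake8 and pytest, are assumptions you'd swap for your project's own checks. You'd save it as .git/hooks/pre-commit and make it executable:

    #!/usr/bin/env python3
    # Hypothetical pre-commit hook: run fast checks before a commit is created.
    # Assumes the project uses flake8 and pytest; substitute your own tools.
    import subprocess
    import sys

    CHECKS = [
        ["flake8", "."],                   # lint/style errors
        ["pytest", "-q", "--maxfail=1"],   # fast-failing unit tests
    ]

    for cmd in CHECKS:
        if subprocess.run(cmd).returncode != 0:
            print("pre-commit check failed: " + " ".join(cmd), file=sys.stderr)
            sys.exit(1)   # non-zero exit aborts the commit

    sys.exit(0)

The same idea scales up: the earlier a failing check runs (editor, commit hook, CI on the branch), the cheaper it is to fix.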


What is the anti-labor cartel he's talking about?



Which cartel is he referring to?



> The tech industry is fundamentally unserious about how it recruits, hires, and retains candidates. About which I have a lot more to say than could fit in a tweet, but, a good thing to know.

Would love to read more about that. I'd be willing to bet most SMBs are terrible at hiring / retaining developers.

My last company did literally nothing to help devs learn and grow (or to reward those who did so on their own), which created a department full of miserable, unmotivated people. The only reason most of them are still there is because they haven't found a new job but can't afford to leave.

They're trapped, essentially.


This is helpful for my Asperger's. I like the part about the coffee money. Thank you!


[flagged]


It's extracted from a series of tweets; that might explain the weird rhythm. And surely there's a less dickish way to communicate this?


Why is it necessary to make comments like this? Is it simply to demean those who have mental health issues and were responsible/brave enough to get help? Why use this as your example? You have many others, even just saying "while on stims/speed".


The author is keeping it pretty real based on a majority of my experience. Doesn't have to be spot on, but it's real.


> Your idea is not valuable, at all. All value is in the execution.

Simply not true. Ask the Winklevoss twins.


Yes, it's evidently, demonstrably, the case that ideas have value, because this is enshrined in the concept of intellectual property. Even in a world where IP didn't exist, his statement would be wrong, because there will always be people who can sell an idea, and the value of something is exactly what someone else will pay for it.


That's a near-perfect example of his point.


You are aware, I assume, that the Winklevoss twins were awarded $65 million for "their" idea?


Typos are distracting. I don't know whether "modal" programmers is intentional or a typo for "model".


Pretty sure he means "modal" as in mode average.


Entirely possible, and I thought so too, but then I bumped into a typo later which made me question it. Dunno.


Great, you found a misspelling. But you missed out on all the content. I'd say that's your issue and not his.


He also used "which" improperly (instead of "that") like 7-8 times. Majorly distracting.


Real English, as native speakers and good writers use it, allows "which" as well as "that" in restrictive relative clauses (although it no longer allows "that" -- only "which" -- in supplementary clauses). See http://itre.cis.upenn.edu/%7Emyl/languagelog/archives/001464... ("although there is a clear frequency difference between the dialect groups [British and American English], ... both that and which are grammatical in integrated relatives in both dialect groups"). The rule is one made up by grammar checkers and authors of pedantic pamphlets, and if it's distracting to you I suggest you go cold turkey and read fewer of those and more of just about anything else.



Those are exactly the "grammar checkers" and "pedantic pamphlets" that I referred to (Grammarly is literally a grammar checker). Why should we consider them right and Brontë, Dickens and Melville, and their editors, wrong?

That is not purely a rhetorical question. While I do think that Brontë, Dickens and Melville are right and the authors of your links are wrong, I also would like to know why you and many others think it's the other way round.


Not OP, but I wonder if his/her brain is similar to mine: I get a small kick of dopamine from knowing that I do something "technically right" that others (especially famous people) do "wrong."

It's been a long, slow road to recovery.


> Your idea is not valuable, at all. All value is in the execution.

This is oft repeated, but not always true. Take the Nazis (bear with me).

The Nazis had the best execution ever. They almost drowned the whole British army, which had previously conquered the world. While fighting battles all over Europe and being one of the most bombarded countries in history, they continued to produce tanks, planes, and submarines, operated a bureaucracy until the very end, and invented bombs and other systems that befuddled all other nations.

Yet they lost the war. The reason they lost is that their idea was so bad that people everywhere were willing to die to make it fail.

An excellent idea can survive a bad execution for some time. A terrible idea will ultimately fail, no matter the execution.


That the Nazis were extremely efficient is a myth, which makes sense if you think about how they organized their government and economy. [1][2]

Though they were effective at killing large numbers of (defenseless) people…

Also, they didn't lose the war because of some flaw in their ideology; they lost due to manpower and industrial shortages (inflicted by the Soviet and Allied militaries) that limited their ability to wage war.

For more, I recommend watching the Hitler Channel (err History Channel). [3]

[1] https://www.washingtonpost.com/outlook/five-myths/five-myths...

[2] http://www.bbc.com/travel/story/20170903-why-people-think-ge...

[3] https://www.history.com/topics/world-war-ii


Terrible analogy. Nazi execution was horrible after the invasion of France. Hitler's conservative generals hadn't foreseen how effective their blitzkrieg tactics would be at the beginning of the war. That success fed his delusional view of his own greatness and of Nazi superiority, even though he and the Nazis had nothing to do with it; it came from the German army's brilliant strategies, tactics, training, soldiers, and preparation.

So Hitler stopped listening to his generals, went on to micromanage the war, and made one terrible decision after another.

• Not invading Britain before starting a war with the USSR, where Germany was outnumbered by over 3-1.

• Focusing the Russian invasion on Moscow instead of the far more important strategic objective of the Caucasus oil fields. Not only did Germany need more oil, but the Caucasus was also nearly the USSR's only supply. The Germans got close but fell short and never destroyed the oil fields, and Germany's war effort was handicapped by oil shortages for the rest of the war. Your air force can't use its superiority if it can't leave the ground, and neither can your tank commanders if their tanks can't start.

• Sending a million-man army to attack Stalingrad when it had little strategic value, then abandoning it and losing every single man. When you are outnumbered 3-1, you can't fight wars of attrition; every man counts.

• Rejecting the Army's plan to liberate Ukrainians and recruit them to fight as allies against the USSR. This would have almost evened the manpower disadvantage with a fresh supply of soldiers who hated the USSR. Instead he declared them subhuman, ordered them liquidated to make "living space" for Germany, and turned Ukrainian resistance behind German lines into a devastatingly powerful asset for the USSR.

• Ordering the Me 262, by far the world's fastest fighter, to be fitted with bomb mounts that slowed it substantially so it could only be used as a fighter-bomber, while Germany was being bombed relentlessly and had lost command of its own skies.

• Wasting huge amounts of resources on wonder-weapons such as the V1 and V2, which had almost no impact on the Allies' war effort.

• Also the Maus and Tiger tanks, which sucked resources away from building more of their best tank, the Panther.


Most of these examples are strategic mistakes, not execution failures. Execution means the ability to do what you set out to do.

Not focusing on the right objective is not, IMHO, an execution problem. It's a vision problem, a leadership failure.


Hitler personally controlling the movement of units on the eastern front, causing massive traffic jams and misallocations of fuel, was clearly an execution issue.

But much of what I wrote above was also an execution issue. Making the oil fields the number-one priority was an obviously correct decision for the army generals. Not being allowed to pursue it because of diverted resources was an execution issue.


Or they lost the war because of bad execution. They had fiefdom fights in the upper ranks, and Hitler wasn't able to have anyone tell him he had bad ideas (e.g. a war on two fronts, and locking tanks in Paris instead of letting them be commanded as needed during D-Day, which again came down to infighting).

Germany was also being worn thin on supplies. It's not as if they were in perfect shape minus the Allied bombings.

Also, the Nazi ideals weren't terrible in the way a business idea can be terrible. Many people bought into them.


You clearly need to re-read John Gill.


You're going to have to flesh out your meaning more. Are you implying Nazi Germany didn't make quite a few large mistakes that are often pointed to as how they lost momentum and failed to repel D-Day?


> Significant advances shipped by the tech industry in the last 20 years include putting the majority of human knowledge in the hands of 40%++ of the world's population, available on-demand, for "coffee money" not "university money."

Oh, I didn't realize tech invented the library card.


As cool as libraries are, and they are cool, if you value your time at even minimum wage rates, they are not "coffee money". That's why you... yes, you, even you... whip out your phone to answer questions that you would never dream of running to the library to answer on the spot. This is a real effect, not an illusion. Friction matters, a lot.


Again, all these things people are saying are indeed true, but they also make the university comparison in the original quote something of a red herring.


I'm not sure you, or any of the other commenters in this thread, quite understand what was meant by "university money". "University money" is what it costs to buy scientific journals; in theory anybody can do it, in practice only universities have both the need and the funding to do it. It isn't that your phone is a university... it's that it has all this information accessible for so dirt cheap that African farmers can get at it, modulo the language barrier.


Interesting, thanks for the clarification.


Libraries are not run with "Access to Information" as their main goal. They're run with service to patrons as their main goal. If you think I'm making a distinction without a difference, how many libraries have subscriptions to Playboy?


> how many libraries have subscriptions to Playboy?

It might be a difference, but it's a difference for which the university comparison is a red herring.

How many universities have a porn department?


Universities can study porn under the heading of the humanities or social science. The fact that they don't have an explicit "Department of Pornography" is both an accident of our culture and a reflection of the fact that our "top-level categories" for academics are more abstract than that.

My point is, libraries maintain a very curated collection, and choose their holdings and resources based on very restrictive criteria, including funding and shelf space, making them relatively poor at keeping broad and deep holdings. Until you give the library an Internet connection, which brings us back to the origin of this sub-thread.


I think we're both way off topic if jerf's description of the intended meaning of the quote is accurate.

But I have yet to be in a university library that stocks back-issues of Playboy...


It would be interesting to determine how many people in the world have access to a library, how many people have internet access, and how much overlap there is between the two groups.



