Hacker News
The Prosperous Software Consultant (2018) (dabit3.medium.com)
162 points by mooreds on June 23, 2021 | 79 comments



One great thing about offering training and workshops is that, especially if you can partner with another company to handle the sales, marketing, online platform, etc., online courses built from this material make a great first "product" to start transitioning from a purely service-based (get paid for your time) business to one with some sources of recurring revenue (get paid for work you've already done).


I concur with a lot of what Nader has laid out here. Specialization + Content are keys to unlocking much higher pricing power.

Getting started is tricky, but my most profitable side work has been workshops and lunch-and-learns, which is beneficial as I can re-use a lot of the same content between engagements.


I agree as well. You can be a "cloud consultant" or you can go all in on AWS costs, have a newsletter, keep your face in front of people, etc. Guess which one probably does better.

Of course, it has to be a specialization that enough people need and are willing to pay for.

You also have to keep a clear lookout for a new area or areas if your specialty is in decline. You don't want to be the Y2K mitigation expert in 2001. Or the top performance expert for some legacy or discontinued computer architecture.


We started with lunch-and-learns, but eventually moved to beer-and-briefings.


Both of the above are better than the dining and whining that folks like me, disillusioned with modern software, do.


Can you elaborate on the critical difference between -learn and -briefing, or does it just rhyme better? :)


If you're not already familiar, I think you'll be interested in the concept of alliteration.

https://en.wikipedia.org/wiki/Alliteration


One contains beer.


It just rhymes better!


> Flat Rate Pricing

> This also applies to software development. If an e-commerce application you are building will bring in $500,000.00 in sales in the next year, then charging 10% to 20% of this amount is acceptable. The same goes with feature implementation. If your feature will save them $200,000.00 in the next year, price with that in mind vs the hourly rate.
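
As a minimal sketch of the arithmetic behind that rule of thumb (using the quoted $500,000 figure; the function name and the 10%-20% band as parameters are purely illustrative):

    # Value-based price band per the article's 10%-20% rule of thumb.
    def price_band(projected_value, low=0.10, high=0.20):
        # Return the (min, max) quote for work worth `projected_value` to the client.
        return projected_value * low, projected_value * high

    lo, hi = price_band(500_000)   # the article's e-commerce example
    print(f"Quote between ${lo:,.0f} and ${hi:,.0f}")
    # -> Quote between $50,000 and $100,000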

And how do you know what amount of money they'll make or save? Do you estimate it yourself? Do they tell you?



I was about to write that a guide to estimating savings or income from partial information would be really useful; then it occurred to me that, since businesses & management do that a lot, this is probably exhaustively covered in the business & management literature.

Maybe what would be enlightening, then, is some case studies of how to do this specifically with the kinds of info a software consultant can readily access (or, indeed, a low-ranked development employee; all are advised to attach a dollar value to their work for later bragging/self-promotion purposes, after all).


It seems to me that the person who researches how to do this for software development, then packages and publishes the information, is themselves producing a valuable (and therefore saleable) piece of content.


Just ask. A good client thinks of you as a trusted business partner and will want you to be fully informed.

Maybe you can help them make or save even more.


i’m always amused that the answer is always “work harder” and “do more shit”. meanwhile, in other consulting professions, people say “no”. why is this so difficult? the only remaining solid piece of advice here is to specialize. i can confirm this works.


Were we reading the same story? There was a long section about how higher rates are much more important than hours worked, and how you want to move away from getting paid for hours billed (butt in seat) to value added (irrespective of time taken).


the article mentions a lot of extra work and projects to make a name for yourself, which will take half your time if not more. that is what my comment is about.


also, charging by project rather than by hour will only work if you know really well how much time something takes, which can be very dangerous IMHO.


This is a pretty good post. I've also done pretty well consulting. Some quick notes.

First: $210k/yr is not necessarily really good money for a US software consultant. When consulting full time, please try to keep two things in mind:

(1) Your cost basis is higher than it was when you were a W2 FTE. If you're making $100k/yr in salary, your employer is paying substantially more than $100k/yr to keep you on staff; you have a "fully loaded" cost that includes not only infrastructure stuff like computers and office space and Google accounts and training and vacation and sick days, but also your benefits and a pretty substantial chunk of taxes, and a bunch of tax planning stuff that your W2 hides from you. You're now on the hook for all of that.

(2) More importantly: employers are on the hook for the fully loaded costs of their employees indefinitely. Well-run companies hire developers with the expectation of keeping them on staff with no fixed end date (run don't walk from any that don't). Which means that the decision to hire a freelancer versus a full-time employee is not simply based on rate; it's also based on the fact that the freelancer comes with a guarantee that the relationship can be severed the moment it's no longer valuable. That guarantee has a lot of value; "double your FTE rate" isn't even stretching it. If you're giving that up for free, by working at a rate comparable to what you'd be making in a good job, you're doing it wrong.
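
A minimal sketch of that arithmetic, with assumed numbers (the overhead multiplier and billable-day count are illustrative, not rules):

    # Rough fully-loaded cost, and a floor for a consulting day rate.
    salary = 100_000            # the $100k/yr W2 salary from point (1)
    overhead_multiplier = 1.4   # assumed: benefits, employer taxes, gear, time off
    billable_days = 150         # assumed: out of ~230 working days in a year

    fully_loaded = salary * overhead_multiplier
    day_rate_floor = fully_loaded / billable_days
    print(f"fully loaded: ${fully_loaded:,.0f}/yr")    # fully loaded: $140,000/yr
    print(f"day-rate floor: ${day_rate_floor:,.0f}")   # day-rate floor: $933

Point (2) is the reason to quote well above that floor: the client is also buying the option to end the engagement the moment it stops being valuable.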

I don't think it ever makes sense to work hourly. I've written a ton of posts here about why that is; here's a link to the one people seem to like the most: https://news.ycombinator.com/item?id=4103417

What I can say with almost 10 years' remove from that post is that I was, if anything, underselling my position on this. When I was beating the drum on not doing hourly work, I was at Matasano, and we had a day rate (you couldn't buy work from us in increments of less than a day). After that, we started another consultancy, where our minimum billable increment went up... uh... substantially from that. You can do week rates, and not sell in less than 1-week increments; you can do month rates; you can do more than that.

The classic dumb argument about hourly versus not tends to devolve to debates about the pitfalls of fixed-rate work. I don't advocate for project rates (I'd do a project rate, I guess, if it made sense; I'm not religious about them). Rather: I think you should provide your customers with a proposal for a total cost for the project based on an estimate of the number of days (weeks, months) you think it'll take, and a SOW for a T&M project with available prorated overages if it takes longer. Then do your best to deliver according to your estimate; if you blow the estimate because you screwed up, eat the overage; if you blow the estimate because your customer didn't get you access to the systems you needed to work on until 3 weeks after the kickoff, they eat the overage. Nobody has ever pushed back on me for this.

When I spelled this out on HN back in like 2010, people responded as if it was black magic. I think what's really happening is that people who run serious consulting firms just don't write a lot of HN comments, because I know of lots of big firms that work this way.
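
A minimal sketch of that overage rule (the function and all numbers are made up for illustration):

    # The client pays the estimate, plus overage only for delays they caused
    # (e.g. late access to systems); the consultant eats the rest.
    def invoice_total(estimated_days, actual_days, client_caused_delay_days, day_rate):
        overage = max(actual_days - estimated_days, 0)
        billable_overage = min(overage, client_caused_delay_days)
        return (estimated_days + billable_overage) * day_rate

    # Estimated 20 days at $2,000/day, took 27; 5 of the 7 extra days were
    # spent waiting on client-side access, so the consultant eats the other 2.
    print(invoice_total(20, 27, 5, 2000))   # -> 50000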


>if you blow the estimate because your customer didn't get you access to the systems you needed to work on until 3 weeks after the kickoff, they eat the overage.

This right here, I once waited 3 months for a company to get their stuff together for me to work on it. I had other projects going on, but you should be paid for waiting time as well, because you won't be taking on as much other work if you expect to get a huge project any time now.


> day rate

Didn't the clients expect the holy 8 hours in this case?

Another class of freelancers/consultants that's missing from this discussion (hourly rate vs other): long-term/indefinite flex-time workers. No estimates or promises, just pay-as-you-go for several years or more. With an hourly rate, in this setup, one can work 5 hours one week, 0 hours the next week, 35 another week... So it's a kind of very flexible employment arrangement disguised as consulting.


Clients expect the work they plan to get done. They don't care about the hours you spend unless you demand they care about it, by running an hourly meter.


One thing I’ve always been curious about is why law firms so aggressively meter. Wouldn’t most arguments you’d apply to software consultants apply to them as well?


It's a good question, and I really believe the answer is just that they've been given social permission to gouge (at least, if you're not big or savvy enough to work out a fixed-fee arrangement) that other professions don't really have. Everyone expects a law firm to nickel-and-dime. And, of course, you're really wary of that when you take calls with your lawyer.


> Everyone expects a law firm to nickel-and-dime.

It's something they are expected to do for their clients, so, within reasonable limits, doing it to their clients is expected and shows a serious professional attitude.


Not all do.

For example, a defense attorney defending a DUI charge will likely charge a flat fee. They will also charge according to what it is worth to you.

E.g., if you were Palo Alto's most successful real estate agent and were at risk of losing your driver's license, you would be charged accordingly when you showed up at the doorstep of a prominent defense attorney.


^tptacek speaks wisdom here. If you have a client giving you a hard time about your time units, that's probably a client who will be more trouble than they are worth. When I was still doing consulting, I got to a point where I steered every client toward monthly retainer billing because I was tired of the ones who wanted to count every hour rather than focus on outcomes.


>Didn't the clients expect the holy 8 hours in this case?

It's your day, not theirs. Accounting for number of hours spent in a day on client work is just getting back to hourly charging.


That is when agreeing on some fixed number of days, i.e. a fixed price. In the case of an indefinite contract without upfront estimates, a day would be expected to be a full day, i.e. back to hourly.


I'm not really sure what the discussion here is about. If you sell someone 5 days of your time, starting Monday, and Wednesday is a billable day, then, yes, you need to spend Wednesday working on that project. In the sense that you can't spend half the day billing on someone else's project (you'd be double-booked, which is a no-no), sure. But nobody is looking to make sure exactly which hours you worked on a given billable day; that's just not how it works.

For the most part, this stuff is honor system.


Consider a rolling yearly renewable contract, with no upfront estimations whatsoever. Just a continuous stream of work. The hourly billing allows "guilt-free" flexibility. I can work 2 hours one day, 0 another day, 6 yet another day, etc...


I know these are just examples, but the article keeps mentioning specialising in progressive web apps.

As a user I want to use PWAs.

As a developer I prefer to program PWAs.

But as a contractor, nobody wants them. Everyone is happy to use react native and spend hours/days debugging their build each time they do a minor upgrade to XCode.

I could advertise myself as a PWA specialist. But I've yet to see evidence that anyone would care.


That's the thing. You are supposed to specialize in skills the market demands. You could be a god-mode engineer in computer hardware design, but it seems there is not enough traction there right now.


There is an element of sales involved: you need a strong value proposition and you need to sell it to your clients. If your approach is so much better than React Native, tell them what it gains them. How much money do they save? How much more performant is it? Let them know how important their site speed scores are, and then show them how a PWA fares against a typical React Native site, and so on.


All good points about marketing, billing, and presence on social media. Worth reading for folks who want to be on the consulting side.


I'm going gray and have been thinking about going the freelance/consultant route. The problem: getting started without having to go to freelance sites and charge considerably less than what I can get as an FTE.

I make a very comfortable salary these days, but I worry about only having a single income stream.


Freelance sites mean competing on price (I know, not really, but pretty much). LinkedIn has worked great for me: recruiters bring contracts to me, and I keep in touch with other contractors who share work with me. This might depend on your location, industry, and area of expertise, but it's kept me humming since I went independent in 2015.


Avoid those freelance sites like the plague. It's a race to the bottom.

Other places to look:

    * Ask former colleagues if they need any off-hours help.
    * Ask people you meet at meetups if they are looking for help. (If you attend meetups.)
    * Look at the 'who's looking for a freelancer' posts here.
    * Look at fast growing companies in your area or network and ping folks there. They may want only FTEs, but they may also be open to contractors.
    * Look for part time opportunities. For example, there's a big need for training in certain technologies (AWS, k8s, terraform), and that can be a nice base for a consultant.
Another thing I've done is leave a job and offer to contract back for a time. This hasn't been a long-term option for me, but it's a nice way to bridge income.

If you are doing any moonlighting while employed, check your employment contract and ensure it is allowed.


> Specialization

> One of the best pieces of advice I’ve acted on is deciding to specialize.

How do you choose what to specialize in? I have experience with quite a few things, but I don't know if it would feel the same once I'd specialized in one of them.

The author himself seems to be an expert in React + React Native + GraphQL.


I don't think specialization in this context means framework or even language. Someone who is really good at JavaScript is of very little use as a consultant for a business. They want someone who knows the business. So I think specialization in this context either means knowing a lot about the infrastructure - e.g. being able to do whatever your client needs in AWS quickly and in a cost-effective manner - or, knowing a lot about the business domain, whether that's healthcare, logistics, etc. The code is usually a byproduct of that and being an "expert" in a JavaScript framework isn't going to sell anything to a hospital system looking for a software consultant.


Why does medium break text selection?


Because they tested on Chrome/Chromium but not on Firefox.

Maybe the selection disappears on Firefox because their custom context-menu is rendered on top of the selection, thereby "obstructing" the selection by a few pixels?


Firefox? Use the C-Alt-R, Luke. (Reader mode.)


Ctrl-Alt-R restarts the browser.

Reader view is toggled by F9.


Uh, my bad. It looks either version- or OS-specific. https://superuser.com/a/1147441/21439


Nice article. Thanks for sharing. Most of it applies to other fields as well.


React Native is like Visual Basic in the 90s.


[flagged]


"Back in the day", some of us used to deliberately alternate pronouns for these sorts of hypothetical people. The idea was to help nudge people toward perceiving "gender-atypical" occupations/situations/etc. as normal: "The programmer opens her text editor...".

Since then the world has changed quite rapidly when it comes to norms around writing, pronouns, and gender, so perhaps "they" fills the same role for younger writers.


Large corporations sometimes come up with user personas, which I've always thought are fun to work with. Like alternating pronouns, they're a very reasonable, noncontroversial way to remind people their users might not look like them (busy parents, someone performing the same repetitive tasks all day with your application who would appreciate automation, etc.).


I find the user personas in your example to be quite "out of the way" and in fact reasonable, unlike mere gender "substitution", which I (being a non-native speaker) tend to perceive as an unnecessary (and even off-putting at times) piece of information.

Calling that an "alternation" rather than a "substitution" gave me a different perspective, though.

I'd often find Alice/Bob examples in (programming-related) books, but seeing "she" instead of "he" in situations where gender isn't important wasn't that common in my experience.

Some other commenters pointed out that the practice is quite old in English. My use of English is mostly limited to reading technical information and blog posts, so I can only assume that the alternation not being popular is (was?) mostly specific to technical articles/blog posts.


Native speaker here, and it took me most of a decade of seeing the use of the feminine for the non-specific or neuter pronoun before it stopped being very annoying (it was uncommon outside certain circles, until fairly recently). I still have to catch myself when I encounter it to avoid glancing back over the text to figure out which specific person is being referenced, whom I must have missed in my first reading—IOW, it remains a distracting style, for me.


> I still have to catch myself when I encounter it to avoid glancing back over the text to figure out which specific person is being referenced, whom I must have missed in my first reading

That's funny. Saying "off-putting" I had exactly that in mind.

What? She? I must've zoned out.


Yeah, I don't have a problem with the politics of it, exactly, but it's definitely more disruptive to my reading than favoring the masculine (only because I learned that way, of course—had I been taught otherwise, I'd have the opposite problem) or using the singular-plural form (they/them/their).

Maybe the irritation's worth it for the effect that effort's having on culture. I really don't know. In the end, it's not that big a problem for me, just a little irritating, so I simply hope it's doing something actually-helpful for someone and don't worry about it too much.


Everything does not need to be made part of culture wars.

Why would using she be any different than he? It's like Alice and Bob of oh so many quantum thought experiments but without Bob. If there is only one protagonist you have to go 50/50.


Off topic question for you: I'm an ESL speaker and I notice a trend where some people phrase it like

"Everything does not need"

But in my head it sounds better as

"Not everything needs"

Is the latter not acceptable English? Or is it something regional? In my language, it is customary to negate individual nouns, but some English speakers seem to only ever negate the verb.


Fluent native English speaker here.

"Not everything needs..." is definitely acceptable English, and sounds more-correct to my ears. It would also be fluent to write that sentence as "Some things do not need...".

As a logician, I find the phrasing "Everything does not need..." to make me a bit queasy, and I would prefer if it were regarded as unacceptable.

HOWEVER, it's not unacceptable. For example, a famous aphorism is "all that glisters is not gold" (often diluted, in recent centuries, to "all that glitters is not gold"), which is surely not intended to claim that gold never glitters. So I guess I have to tolerate hearing it the illogical way.


Exactly! English is the only language where I hear such illogical statements. Same goes for the inverted usage of "all but", which imo _should_ mean "everything except for" but seems to mean "almost". I wonder what causes this.


> English is the only language where I hear such illogical statements.

Fear not, the exact same thing happens in other languages. Must be that you just don't hear much, for instance, Swedish: "Allt som glittrar är inte guld." (Should logically be "Inte allt som glittrar är guld", just like in English.)

> Same goes for the inverted usage of "all but", which imo _should_ mean "everything except for" but seems to mean "almost". I wonder what causes this.

That doesn't feel "inverted" at all to me. You just need to interpret the "all" as it must have been intended: "the whole"; "all the way there" -- then "all but" quite logically becomes "not quite the whole way there".

Like, if a piece of software was almost, but not quite completely, finished ==> "the software was all (the way to) but (not quite arrived at) finished."

Maybe it helps to see the similarity with this related form: Almost all the bugs were fixed, but one was left ==> "All but one of the bugs were fixed." (The bug-collective was all but vanquished. :-)


Logically, the two constructions have different meanings. The first is equivalent to "nothing needs", so it excludes the possibility of some but not all things needing, while the second includes that possibility.

However, some people use the first incorrectly when they really mean the second. This happens often enough that you could argue the incorrect usage has now become idiomatic as well.
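
A tiny sketch of why the two readings come apart, checked over a finite domain (the predicate is made up):

    things = [1, 2, 3]
    needs = lambda x: x == 1   # exactly one thing "needs"; the others don't

    literal  = all(not needs(x) for x in things)    # "everything does not need", read literally
    intended = not all(needs(x) for x in things)    # "not everything needs"
    print(literal, intended)   # -> False True: the two readings disagree here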


I'd like to agree with you about the "nothing needs..." construction being incorrect, but remember that you're never allowed to accuse Shakespeare of using English poorly.

  All that glisters is not gold—
  Often have you heard that told.
Merchant of Venice, Act II Scene 7

(Also Tolkien, in the Riddle of Strider, in Fellowship. Imagine a comparison where Tolkien is the less-distinguished writer; how often does that happen?)


In my experience as a native speaker "Not everything needs" is the more common phrasing.


Sample size of 1, but as a native American English speaker, I'd use the latter phrasing of "not everything needs."

The former example seems characteristic of non-native speech, but is completely understandable also.


For me the difference is in emphasis, if I read it out loud

"Everything does not need" emphasises everything.

"Not everything needs" emphases not.

It's subtle, and probably subconscious, but it has an effect.


The discussion below is great. I think "Not everything needs" would have been better as well.

One reason I'm participating in HN is to hone my written English, so I appreciate very much that you pointed this out :)

My native tongue (Finnish) is not in the Indo-European language family, so I'm occasionally blind to these sorts of patterns that are maybe more obvious to speakers of other Indo-European tongues.


I wouldn't read too far into it, probably just casual use of language instead of trying to push an agenda...


> I wouldn't read too far into it

I realize how my initial comment might sound bitter as the topic is quite controversial, but I'm more curious than anything.

Seeing such "unusual" placement of pronouns a few times in a short period of time made me wonder what's the deal.


Although "he" was historically ostensibly gender-neutral, in practice people tended to use the pronoun of the dominant gender in an occupation. If you're talking about a generic nurse, you'd probably refer to "her" or "she".

Which, if you're trying to avoid gender stereotypes, leads to awkward "he/she", somewhat obvious switching between male and female personas, etc. The typical approach in style guides these days is to use the singular "they". The second person got rid of its separate singular and plural pronouns as well, so it's not unreasonable.


>Can someone elaborate the benefits/goals of using the feminine pronoun here?

Culture wars, but a much more gentle version than the current variety. There's definitely an agenda here as it wasn't done in any kind of unthinking organic fashion, especially given long-established norms (which weren't always male). Rather like replacing BC with BCE.

Some future historian will probably look at an AP style book over time and pass judgement on us all. Dunno how it will look. Maybe they'll think of the singular use of 'they' as implying that humans of this era had multiple personalities.


> If 'he' isn't (for some reason) acceptable as a gender-neutral pronoun, wouldn't 'they' be a better fit?

There are people, today, who will fault you for not using 'he' or 'she', since 'client' is singular and 'he' and 'she' (unlike 'they') are too.


I'm one of them. Grew up with "they" as strictly plural. Had enough red ink on school essays that the lesson sank in. Every time I see "they" in a singular context it's like mentally stubbing my toe on a tree root. I have to stop, go back and re-read to be sure I didn't miss something that set up the plural context.

If you insist on being gender-neutral, you have to write "he or she", which quickly gets tiresome. Or rewrite without using pronouns, which can also be awkward.

I learned that "he" is gender-neutral if there is no other context, and that's the way I continue to write.


"They" has been an acceptable third-person non-determinate singular pronoun since at least the Canterbury Tales:

"And whoso fyndeth hym out of swich blame, They wol come up..."

That your English teachers were misinformed prescriptivists isn't the language's fault, and shouldn't be used as a ruler by which you measure the writing of others.


A couple of decades ago, I'm guessing that most English teachers/editors would have corrected, Chaucer notwithstanding. But, yes, general style these days does favor "they" as preferred third-person non-determinate singular pronoun because it's better than all the alternatives for various reasons.


They're from Idaho, based on the username, so they haven't caught up to Chaucer yet...


This is true. I notice that practically all Californians I meet have a deep knowledge of Middle English and have studied Chaucer extensively. They even pronounce 'Chaucer' correctly, which is very rare globally.

Comments like this are why I love Hacker News.


Someone may say that "he" is gender-neutral, but it's not in practice.


Are you new to the internet, or the English language? This style has been done for several decades now, even before the internet, in various publications.

Why do you think it's supposed to be a "step towards reduction of sexism"?


> Why do you think it's supposed to be a "step towards reduction of sexism"?

Don't you? Then what do you think it's for?


It's a very old and common stylistic device in literature.

I find it refreshing to read, but I also doubt it plays a major role in the 'reduction of sexism'.


I almost can't read your comment, it's so downvoted by the herd. You can't question things like this, even though what you say makes total sense.



