
> there are few people who still so concisely understand the intersection of engineering leadership and those table-stakes capabilities we normally associate with SRE / DevOps.

I'm curious what else is good to read on this topic, if anything comes to mind?


Finishing things (esp. personal projects) feels really good. Leaving them unfinished starts to feel really bothersome. I think the author describes a lot of this really eloquently (while I would also note the irony of writing a meta blog post about this topic during the time when you could have been finishing something).

Speaking as a Ruby developer, this is very cool. I could see using this for algorithmically generated music, and that would be neat. (Speaking of which, I've been meaning to turn our noisy git repository events into some kind of algorithmic music — this would be good for that.)*

But speaking as a musician with lots of keyboard instruments — if I were trying to compose a piece of music, I'm not really sure why I would compose musical sequences with Ruby when I can play them with a MIDI keyboard, or record an acoustic instrument and loop it. I see that you can input anything you like with the Ruby DSL, but it's not a great UX compared to a piano keyboard.

I guess to put it another way — I would not use this to replace what I use Logic for. But that's just me.

* see also http://listen.hatnote.com/


I will say though - in some ways, the Ruby DSL expresses intention more clearly than a MIDI sequence. Sometimes you really do think in terms of "now it's a C minor arpeggio, and now it's an F major arpeggio". And I like how you can compose functions that express that so clearly.


Yeah, I think Sonic Pi is really more about algorithmically created music than general composition (specifically live coding, but being mainly algorithmically created rather than played note-by-note seems pretty fundamental to that).

But interesting uses of tools that aren't what you would otherwise reach for, because of unusual constraints, are pretty squarely in the "on topic" domain for Hacker News.


Music theorists who have poor coordination and fine motor skills.


I alluded to this in the OP, but it's mostly due to logistical constraints. While I have a keyboard I like, and could theoretically use for this, all of my instruments have been in storage for the last... while, for messy personal reasons.


I asked some classical music people once if there was a history of harmony. Someone retorted that the modern concept of harmony was an inadequate way of understanding counterpoint, and that the primary concept should be voice leading, not vertical harmony. I suppose it makes sense — that our categories of musical analysis have histories, and it can be misleading to apply them out of context (as the rest of this thread is commenting).

But I still wish someone would write a history about "what kinds of harmonies do people think sound good/melodious/interesting, and which do they consider bad/ugly/weird/useless at a given moment." Or if that history already exists, I wish I knew how to find it.


"Harmonielehre" by Dieter de la Motte is an excellent book that does exactly this. He explicitely points out that harmony must always be understood and analyzed in context. "Kontrapunkt", his book on counterpoint, follows the same principle and is just as good. I think/hope there are English translations.


Leonard Bernstein's Harvard lectures have some of this.

https://youtube.com/playlist?list=PLuQqHfLobLUIzDiIGrtE41KvQ...


You'd also need to locate such a project in place. There isn't just one music history; each culture has its own. They've only begun to somewhat converge in the last couple of centuries, but there are at least dozens of distinct, independent musical traditions with at least a thousand years of continuous development.


A great deal has been written on this, but the vast bulk requires a solid understanding of theory. The Cambridge History of Western Music Theory gives a lifetime's worth of material to study on this topic.


From a jazz musician's perspective, classical theorists tend to confuse the map with the territory. Classical theory is (at least historically) often prescriptive rather than descriptive, which tends to cause some friction when applied to music outside of that idiom.


I'm curious whether this is demonstrably an optimal strategy for individual investment too... I haven't had much success finding clear data on whether active management produces better results.


Short answer re: investing in active managers (based on my many years listening to rationalreminder.ca) is that, if you eliminate some of the worst active managers, the average returns net of fees are the same. However, eliminating the worst managers is challenging (but not impossible) to do ex ante. Even then, you’re only getting the same average returns as indexing, not better. Plus, you will experience higher dispersion with active managers (a greater chance of an extreme negative or positive outcome), which is not desirable. Because of these two issues, it’s definitely better to pick an index fund.

Re: picking stocks yourself, the answer is also pretty cut and dried. There’s strong evidence no individual trader can expect to beat the market. Active managers can only beat the market gross of fees because they employ large teams of people to do a lot of work to gain a small edge (stuff like predicting retail sales numbers from satellite images of store parking lots).

This is my attempt at summarizing a whole field of research in a few sentences. There are many more nuances. I highly recommend listening to the Rational Reminder podcast if you’re curious about this sort of thing. They interview a lot of academics.


There are more dimensions to an investment than average returns.

Volatility-adjusted returns (or the Sharpe ratio), for instance, will tell you how much return you get per unit of risk you take. This is important because getting 10% average annual returns with 10% annual volatility is worse than getting 5% returns with 1% annual volatility. You can only compare investments at an equal level of risk.
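
For concreteness, here's a minimal sketch of how a Sharpe ratio can be computed from per-period returns (the zero risk-free rate, the 252-trading-day convention, and the toy data are my assumptions, not part of the argument above):

    import statistics

    def sharpe_ratio(returns, risk_free_rate=0.0, periods_per_year=252):
        """Annualized Sharpe ratio from a list of per-period simple returns."""
        excess = [r - risk_free_rate / periods_per_year for r in returns]
        mean = statistics.mean(excess)
        return mean / statistics.stdev(excess) * periods_per_year ** 0.5

    print(sharpe_ratio([0.01, -0.005, 0.007, 0.002]))  # toy daily returns

    # The two strategies above, in these terms:
    # 10% return / 10% vol -> Sharpe ~1.0
    #  5% return /  1% vol -> Sharpe ~5.0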

Another factor to take into account is diversification. If you have an alternative investment to compare to your base, and its average return is lower than your base but it is uncorrelated, then you actually get a higher volatility-adjusted average return by investing in both.

That's just two dimensions to take into account, there are many others, but overall:

- Don't compare investments based on annualized returns alone; it really doesn't make any sense.

- Don't compare investments one against another; instead look at the additivity of one on top of the other.


> getting 10% average annual returns with 10% annual volatility is worse than getting 5% returns with 1% annual volatility.

Doesn't this depend on how long you're planning on investing for, and what your criteria for selling your investments are?

If you're planning on investing for at least 10 years, and you're willing to give yourself a 2-3 year window for selling your investments once they reach a threshold you decide on ahead of time, isn't the 10%/10% investment better?

(e.g. if retirement is 20 years away, you might consider putting your funds in that sort of investment for 10 years, with a view to moving them to something less volatile in the 5 years after that, as soon as they cross a 10% annualised return threshold during that window.)


> isn't the 10%/10% investment better?

If you consider that "you don't know any better" and returns are normally distributed (i.e. you don't have some secret sauce nobody else knows about), then there is no dimension in which the 10/10 is better.

You can convince yourself intuitively by imagining how you would maximize each strategy. The amount of money you can deploy is a function of the risk you take: if you want to do something risky you will not be able to borrow much, whereas if you want to do something safe you can borrow easily.

That is, your objective is to maximize your expected return, under the constraint of not breaking your risk limit.

Suppose you have a 10% annualized volatility risk tolerance. That's your budget.

If you invest it all in a 10% average return / 10% annual vol strategy, that's it.

Now if I offer you a 5% average return / 1% volatility strategy, you just have to go to the bank and borrow 10x your capital. You will have the same risk exposure (in dollars), but 5x the expected returns.
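
Back-of-the-envelope version of that argument (the zero borrowing cost is a simplifying assumption on my part; as replies below note, borrowing is not actually free):

    # Strategy A: 10% expected return, 10% annual vol
    # Strategy B:  5% expected return,  1% annual vol
    leverage = 10          # run B at 10x
    borrow_rate = 0.0      # assumed free borrowing, for simplicity

    ret_levered = leverage * 0.05 - (leverage - 1) * borrow_rate  # 0.50
    vol_levered = leverage * 0.01                                 # 0.10

    # Same 10% vol as A, but 50% expected return instead of 10%.
    print(ret_levered, vol_levered)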


Is 10x borrowing even an option if we are talking retirement savings?

I don't know much about finance, but I'd guess that at that point (you've borrowed 10 times your net worth) this is no longer your investment; it's your lender's investment. They will adjust the interest rate to match the riskiness of whatever you are doing, leaving you with net zero.

Borrowing money is not free.


> Is 10x borrowing even an option if we are talking retirement savings?

Of course it is, though not exactly by "borrowing money" in the "mortgage" sense. Margin trading is one way to take leverage; derivatives are another. The former is simpler but costly; the latter is cheaper and allows you much more than 10x leverage, though it requires some high-school mathematical thinking.


> you just have to go to the bank and borrow 10x your capital.

"just" is doing a lot of heavy lifting in that sentence.


Until a black swan event bankrupts you because of the leverage you have taken on.


> there is no dimension in which the 10/10 is better

It's clearly better in the expected return dimension. This strategy has 10% expected return, while the other strategy has 5%.


What you're saying sounds right. But in practice, no one is going to lend me, a nobody, 5x my money. At least outside real estate, but that's its own crazy alternate reality.


5x leverage is nothing; most retail brokers will offer you much better initial margins on futures.


> - Don't compare investments based on annualized returns alone; it really doesn't make any sense.

> - Don't compare investments one against another; instead look at the additivity of one on top of the other.

I don’t think either of these matters to the 90% of investors whose goal is to build up a nest egg for retirement, which means money they won't spend for decades.

Sharpe ratios and all those “risk” adjusted calculations all involve assumptions that may or may not be true.

Comparisons of just annualized returns over long periods of time seem fine for broad market index funds, especially if you are assuming the US federal government will provide a backstop.


> Sharpe ratios and all those “risk” adjusted calculations all involve assumptions that may or may not be true.

On the contrary, these risk adjusted measures assume nothing more than a normally distributed random variable.

If you just look at annualized returns, then go ahead and invest in CDO ETFs.

More seriously, the S&P for instance has around 20% annualized vol, which IMHO is way above what you would want for a retirement fund. I would target something closer to 10%.
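
(For reference, the annualized figure is conventionally obtained by scaling the standard deviation of daily returns by the square root of the trading-day count; a minimal sketch, with 252 days as the assumed convention:)

    import statistics

    def annualized_vol(daily_returns, trading_days=252):
        """Scale daily return volatility to an annual figure."""
        return statistics.stdev(daily_returns) * trading_days ** 0.5

    # e.g. a ~1.25% daily vol annualizes to roughly 20%:
    # 0.0125 * 252 ** 0.5 ~= 0.198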

> Comparisons of just annualized returns over long periods of time seem fine for broad market index funds

Do you have any sort of reasoning or is it just a gut feeling?


>On the contrary, these risk adjusted measures assume nothing more than a normally distributed random variable.

That is exactly what I am referring to. For example, from Wikipedia:

https://en.wikipedia.org/wiki/Sharpe_ratio

>However, financial assets are often not normally distributed, so that standard deviation does not capture all aspects of risk. Ponzi schemes, for example, will have a high empirical Sharpe ratio until they fail. Similarly, a fund that sells low-strike put options will have a high empirical Sharpe ratio until one of those puts is exercised, creating a large loss. In both cases, the empirical standard deviation before failure gives no real indication of the size of the risk being run.

>Do you have any sort of reasoning or is it just a gut feeling?

The reasoning is that their volatility is negligible over the long term due to political forces. Of course, it is also an assumption that could be wrong, but the mechanisms for government policy (democracy, aging demographics, and voter participation trends) seem to favor reducing purchasing power of currency rather than letting broad market equity prices stall or slide down.


> >However, financial assets are often not normally distributed, so that standard deviation does not capture all aspects of risk

Volatility not being a full measure of risk obviously does not imply that volatility should be ignored.

The former statement, that returns are not normally distributed, is a trivially verifiable factual mistake. Daily stock returns are approximately normally distributed, with a slight positive excess kurtosis. This has remained true over any period in the last 30 years.

I am not arguing that the Sharpe is an all-encompassing measure either; some strategies have a returns distribution that is not well explained by Sharpe. I don't think it matters in this argument though.


> On the contrary, these risk adjusted measures assume nothing more than a normally distributed random variable.

The financial sector isn't yet so unrelated to reality that the price of securities is random.


I think when stating your opinion against 40 years of econometric research, including multiple Nobel prizes in economics, you should feel compelled to explain your opinion a bit more than "no I don't think so"...


Please quote a Nobel prize winner (well, there's no Nobel prize in economics, but surely we understand each other) explaining that stock prices are, in actual reality, random variables.


> explaining that stock prices are

To be pedantic, stock returns, not prices.

As for the quotes, I encourage you to think hard about the meaning of the work of Sharpe, Black & Scholes, and Markowitz applied to non-normal distributions (all Nobel laureates, we understand each other).

In particular, try to articulate the relevance of Sharpe ratios between two non-normally distributed portfolios.


Something has to be said, however, about winning a Nobel prize and then using your model to lose billions and send your company into bankruptcy.

The market is incredibly efficient at pricing many things incorrectly, and Markowitz and then BSM were no exception.


> To be pedantic, stock returns, not prices.

If one is a random variable, so is the other. It's a simple change of variables. What's your point?

> As for the quotes, I encourage you to think hard about the meaning of the work of Sharpe, Black & Scholes, and Markowitz applied to non-normal distributions (all Nobel laureates, we understand each other).

Could you quote the part where they say that actual, real-world stock prices (or returns, whatever) are random?


> If one is a random variable, so is the other. It's a simple change of variables. What's your point?

Not sure I follow your reasoning. Prices are positive-only and non-stationary. That is very much not the case for returns. Usually prices are lognormal, leading to normal (log) returns.

> Could you quote the part where they say that actual, real-world stock prices (or returns, whatever) are random?

It is not said, but rather implied. Take the Sharpe ratio, for instance; it is a measure that:

1) is used to compare different asset / portfolio returns,

2) relies on the 2nd moment of the returns.

The standard deviation is less relevant the further away from the normal distribution you go, so since this is a comparison metric, it can only be reasonably applied to compare normally distributed returns.

If you believe the returns are not generally normal, then you reject the use of the Sharpe ratio as a relevant measure of comparison.

I don't have a B&S reference at hand, and I haven't read it in 15 years, but I'm pretty sure it assumes lognormal prices as well.


> Not sure I follow your reasoning. Prices are positive-only and non-stationary. That is very much not the case for returns. Usually prices are lognormal, leading to normal (log) returns.

Perhaps we should take it back to the beginning. What do you believe "random variable" means...?

> It is not said, but rather implied.

I see.


This is absolutely absurd. Economists, including Nobel prize winners, including even Eugene Fama who proposed the Efficient Market Hypothesis, do not think stock market returns are normally distributed.

At best, using a normal distribution is something that undergrads use as a tool to learn about the stock market and make some simplifying assumptions for pedagogical purposes, but it most certainly is not something that actual professionals or researchers in the field genuinely believe.


Come on, don't turn this into a trial by nitpicking. I am not saying returns are a law of nature meant to teach us normality.

My point is that, for all intents and purposes, you should assume normal distribution of returns.

If you don't, you're obviously on either end of the spectrum: not knowing the subject at all, or nitpicking expertise on the internet.

The subject of the matter here is convincing someone that risk-adjusted measures should be considered when comparing portfolios. This is the basic underlying modeling assumption that 99.99% of the finance world makes: "compare Sharpes", "compare volatility-adjusted returns".

I'm stating 1+1=2 and you're arguing it doesn't hold in Z/2.

> At best, using a normal distribution is something that undergrads use as a tool to learn about the stock market and make some simplifying assumptions for pedagogical purposes

Implicit normality assumptions are everywhere. I encourage you to think hard about your model and question whether anything you do would work on non-normal distributions; you will most likely find that you have millions of these assumptions in your linear combinations, sample renormalizations, regressions, Sharpe weightings, and optimizations.

Now of course you could refine that with Student's t distributions, lognormals, and whatever, but this is more _refinement_ than anything.


>assume nothing more than a normally distributed random variable

But look at something like systemic risk: it’s not necessarily normally distributed. The S&P returns skew left. I’m sure there are other risk metrics that break this assumption as well.


Right, risk is often fat-tailed, but unless we enter an expert discussion, for which HN is hardly a good medium, it is safe to assume 99% of strategies out there yield normally distributed returns. Strategies with non-normal returns are rarer and more sophisticated.


The level of discussion that HN is a good medium for has absolutely no bearing or causal relationship with whether or not the actual stock market is normally distributed.

Imagine really thinking that the nature of a discussion forum can somehow influence the distribution of stock prices, as if stock prices examine comments on the Internet to determine their behavior.


I don't have a dog in this hunt, but that seems like a strangely aggressive response. Perhaps the comment meant nothing more than that plaintext HN is a difficult place to have a discussion that really requires some mathematical machinery, and that therefore, since we can't throw around sigmas and integral signs here, we will make some assumptions.

Attacking the comment with sarcasm isn't in the spirit of HN even if you think that is a dumb idea.


>I don't have a dog in this hunt, but that seems like a strangely aggressive response.

Well I do, and as someone who has seen his other posts on this subject as well, he has a tendency to try to dismiss differing points of view on the basis that he has 20 years of experience and knows better than everyone else, but can't be bothered to explain it.

Someone who has experience and wants to flaunt that experience should do so by coming up with good arguments, pointing people to good resources, and making a good effort to inform, rather than pulling rank as a way to dismiss the conversation under the guise of sophistication and pretension.

I too have decades of experience working at a quant firm, and guess what... many people who post on HN have subject matter expertise and frankly I don't think many of us would agree with the idea that the stock market is normally distributed, or that you need a great deal of mathematical machinery and sophistication in order to demonstrate that fact.

Math models reality, reality does not model math. Whether or not stock prices or portfolios, even the portfolios of those on Hacker News, follow a normal distribution has nothing to do with the nature of the discussion of those portfolios.

Policing people's tone is also against the spirit of HN, but here we are. If you want to police how I speak, flag my comment and/or downvote it.


    > decades of experience working at a quant firm
Has quant finance existed for "decades"?


The meaning of quantitative changes a bit with time and context.

I would say the more "Bayesian / sell side / derivative pricing" meaning has existed since the late 70s; the more "frequentist / buy side / let's hire 100 physics PhDs" meaning became prominent in the early 2000s (as a general feeling).


>> I don't have a dog in this hunt

> Well I do

Well then what's your stake here?

> he has a tendency to try to dismiss differing points of view on the basis that he has 20 years of experience

I surely will concede I have this tendency, but you have to keep the context in mind. You are on an internet forum focused on CS, and a comment thread on personal investments emerges. The very subject of this thread is whether it makes sense to consider risk-adjusted returns or just any kind of returns for your investments.

My argument is that risk-adjusted returns should be used, and that you should assume normal distribution. I am not saying this is a law of nature, but rather that this is a fine and widely used assumption for both practitioners and academics, which allows the argument and explanation to go further without entering an experts' debate (like the one you are trying to start).

So I stand by what I said: for all intents of this discussion, assuming normal distribution should be a given. If you want to dance around it and demonstrate that a Student's t distribution or whatnot is a better fit, go ahead. I think this is more harmful than helpful here.

> try to dismiss differing points of view on the basis that he has 20 years of experience

I think this is important, on the contrary. What is lost on a forum like HN is the context of the people answering comments. When someone comments "I don't think risk adjusted returns are important", it makes a hell of a lot of difference whether it's just the opinion of a random guy or of someone with actual experience.

Now while it takes one sentence to wrongfully dismiss a scientific fact, it can take an expert 100 pages to prove that it's true. Look at a proof that 1+1=2.

That is where credentials are important IMHO. Some debate tangents are not interesting in a discussion, and will only lead to an expert explanation serving no purpose other than confusing the reader and making the expert proud of himself. In these situations, just stopping the tangent is the best reaction IMHO.

So I apologize if you take my comments as dismissive, but try to assume good intent. When someone asks why you should use risk-adjusted returns to compare investments, I think the saner thing is to tell them to assume normal distribution, because that's the far more likely scenario, most of the research takes this overall assumption, and you can proceed to the demonstration that makes sense, which I showcased in my previous comment about 10% returns on 10% annual vol versus 5% returns on 1% annual vol.

To return to the example I posted above: when the discussion is about 1+1=2, I don't think you're doing any good by objecting that it doesn't hold in Z/2.

Assuming normal distribution of returns is a pretty standard basis for comparing investments. It is a basis shared by many models and metrics. Sharpe ratios don't make a lot of sense on non-normal distributions, and neither does mean-variance optimization.


>this is a fine and widely used assumption

Modeling (and possibly economics, especially) is rife with simplifying assumptions that break down in practice. You can find many economists who think modeling individuals as rational agents is a "fine and widely used assumption" while also finding many economists and psychologists showing where this assumption can get you into trouble. There is a big difference between "this assumption is made because it reflects reality" and "this assumption is made because it makes my life as an economic modeler not suck." The latter is still fine, but only if you're upfront about its limitations.


>it is safe to assume 99% of strategies out there yield normally distributed returns.

I don’t know that I agree. If the market as a whole doesn’t have normally distributed risk, it implies even the simplest strategy of buying SPY and holding will also not have normally distributed risk.



> a higher dispersion with active managers (greater chance of extreme negative or positive outcome), which is not desirable.

Some people prefer the chance to win the lottery over a steady income stream.


Except this is a myth. You will not win the lottery without taking crazy amounts of risk. The active managers who do beat a major index for a long, long time almost do not exist in the retail space, and they beat the market only by a tiny amount (~1%). In my era Legg Mason was the most famous, but even they eventually fell.


How does that explain Warren Buffett's spectacular success?


Buffett isn’t a passive investor. Berkshire Hathaway has often taken a very active role in the running of its acquisitions - appointing managers, setting strategy, merging/splitting off subsidiaries, etc. This is as much managing as investing. If Buffett were a pure stock picker, then he would be an interesting case in the active vs. passive investment debate.


When you own one billionth of a company you are just along for the ride. When you buy a 10% or larger stake you can influence the running of the company. Another aspect is that he has offered liquidity to distressed companies like GE and been richly rewarded for it with favorable terms.


Someone with Buffett's success should exist by random chance. (Flip a fair coin enough times and it will come up heads 20 times in a row.)

Also, some fraction of Buffett's success comes from deals that the rest of us don't have access to.


Yeah, I have tried to simulate this several times under different conditions. Given a zero-sum game and random odds, there is always going to be a small percentage who end up with a lot, while most will end up below what they started with. It is easy to explain: say you start with $1000 and each round you have 50% odds of winning 10%. If you win 50% of the time and lose 50% of the time, you are going to end up below what you started with.

If wins and losses simply alternated, it would go like this:

1. $1000

2. $1100

3. $990

4. $1089

And so on... After 100 turns you would have only around $605 (1000 · 0.99^50).

But it's zero-sum, right? Where does the other ~$400 go? It goes, exponentially, to the select few who by random chance have more wins than losses.

In fact, the longer it goes on, the higher the odds of there being an outlier with a lot - you might expect that everyone would converge around $1000, but that is not the case.

I did an example run with 10,000 investors, each doing 1,000 trades; on each trade they bet 10% of their portfolio, with 50% odds of winning.

The best investor had 562 wins and 438 losses, ending with $1,666,061.

The median investor had only $7 left, with 500 wins and 500 losses.

The 90th-percentile investor had $364, with 520 wins and 480 losses.

So, interestingly, even an investor with 40 more wins than losses lost two-thirds of their portfolio.
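
For anyone who wants to reproduce this, here's a minimal sketch of the simulation as described (the parameters match the comment; exact numbers will vary from run to run):

    import random
    import statistics

    def simulate(n_investors=10_000, n_trades=1_000, start=1_000.0):
        finals = []
        for _ in range(n_investors):
            capital = start
            for _ in range(n_trades):
                # betting 10% of the portfolio at even odds:
                # win -> +10% of capital, lose -> -10% of capital
                capital *= 1.10 if random.random() < 0.5 else 0.90
            finals.append(capital)
        finals.sort()
        return finals

    finals = simulate()
    print("best:    ", finals[-1])                       # a few huge outliers
    print("90th pct:", finals[int(0.9 * len(finals))])   # still below $1000
    print("median:  ", statistics.median(finals))        # a few dollars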


The stock market is not zero-sum.


Yes, but in terms of beating the market it should be.


> How does that explain Warren Buffett's spectacular success?

Buffett buys “cheap, safe, high-quality stocks” with leverage, “financed partly using insurance float with a low financing rate” [1]. TL;DR: he’s doing private equity with discipline.

[1] https://www.aqr.com/Insights/Research/Journal-Article/Buffet...


> How does that explain Warren Buffett's spectacular success?

1. Buffett has been underperforming the S&P 500 for about twenty years now:

* https://www.linkedin.com/pulse/warren-buffett-has-underperfo...

* https://news.ycombinator.com/item?id=37827101

For most people who are saving for retirement between the ages of (say) 30 and 65, that's most of their investing lifetime, and such underperformance could radically affect the life they can live once they stop working. Do you want to risk your proverbial Golden Years simply because you chose not to take the market-average returns?

2. While Buffett is a better-than-average investor (and certainly better than me), the main reason why we know him is because he's so rich, but as Morgan Housel notes, the vast majority of that wealth has come from compounding:

> As I write this Warren Buffett’s net worth is $84.5 billion. Of that, $84.2 billion was accumulated after his 50th birthday. $81.5 billion came after he qualified for Social Security, in his mid-60s. Warren Buffett is a phenomenal investor. But you miss a key point if you attach all of his success to investing acumen. The real key to his success is that he’s been a phenomenal investor for three quarters of a century. Had he started investing in his 30s and retired in his 60s, few people would have ever heard of him. Consider a little thought experiment. Buffett began serious investing when he was 10 years old. By the time he was 30 he had a net worth of $1 million, or $9.3 million adjusted for inflation.[16] What if he was a more normal person, spending his teens and 20s exploring the world and finding his passion, and by age 30 his net worth was, say, $25,000? And let’s say he still went on to earn the extraordinary annual investment returns he’s been able to generate (22% annually), but quit investing and retired at age 60 to play golf and spend time with his grandkids. What would a rough estimate of his net worth be today? Not $84.5 billion. $11.9 million. 99.9% less than his actual net worth. Effectively all of Warren Buffett’s financial success can be tied to the financial base he built in his pubescent years and the longevity he maintained in his geriatric years. His skill is investing, but his secret is time. That’s how compounding works. Think of this another way. Buffett is the richest investor of all time. But he’s not actually the greatest—at least not when measured by average annual returns.

* https://www.goodreads.com/quotes/10551666-more-than-2-000-bo...


To be fair to him, he does say time and time again "Invest in an S&P Index Fund"


> There’s strong evidence no individual trader can expect to beat the market.

I don't understand that. If you just bought Apple instead of SPY 20 years ago wouldn't you be doing great?


The key is that for every Apple, there are a ton of companies we don’t even remember the names of that went out of business or otherwise did not beat the S&P 500.

Put another way - if you can reliably pick the next Apple before anyone else, you should go work in finance and make tons of money.


> Put another way - if you can reliably pick the next Apple before anyone else

Problem is that it might take years to verify that.

> The key is that

That doesn't change the fact that there are plenty (in absolute numbers) of individual investors who consistently beat the market. Whether that's because of luck or something else is rather hard to tell.


> That doesn't change the fact that there are plenty (in absolute numbers) of individual investors who consistently beat the market. Whether that's because of luck or something else is rather hard to tell.

It's actually not very hard to tell; if it were because of something other than luck, you'd expect that beating the market in the past would have some predictive value for their ability to beat the market in the future.


Individual traders beat the market all the time; it’s not impossible. But you can’t expect to do it reliably, because in practice it’s essentially gambling, unless you’re Warren Buffett or one of those firms that utilize sophisticated quantitative or algorithmic trading.

So for all intents and purposes, the takeaway for regular investors should be that they cannot expect to beat the market (but they can gamble on it if they like).


Lots of people "play the market" as a mostly total game of chance. They might just as well join a giant pool that tries to guess the ratio of alphabetic characters within each morning's top headline of their favorite newspaper.

Warren Buffett buys the newspaper and has significant control of the editor. That's not the same game at all.

There's a lot of talk here about active fund management. Active ownership is playing on a completely different level.


You do know there are thousands of stocks, right? How many people dump their entire savings into one stock? 20 years ago you wouldn't have known Apple was going to do so well. If people had known, the price would have been bid up at the time.


Which still means that SOME individual investors will inevitably beat the market.


Some will, but there is no reliable way to tell which one in advance.


And a lot of the Apple run-up happened relatively late in the game. Don't get me wrong. Apple--and what I was able to do with the money--was good to me. But so was a late 2010s Microsoft purchase and I don't think a lot of people are highlighting Microsoft as a stock they missed out on during the last 10 years.


The claim is about averages, not cherry-picked examples.


The claim is not phrased like that, so why would we interpret it that way?

> average

So what? It's like saying that since an average person can't run a marathon it wouldn't make sense for any individual to even try it. How does that make sense?

> cherry picked

If we agree that 50% of all investors can't beat the market, what proportion can? 1%, 10%, 30%? Because there is a massive difference.

How do we even define that group? Is it any random person buying random stocks with pocket change? Is it above a certain portfolio size? etc.


"Average" is a bit misleading word when it comes to the market.

If you're a top investing expert who does things carefully in the right way and doesn't make mistakes, you can expect average performance, because the market primarily consists of experts like you.

Of course, investing is a random process, and you often beat the market by being lucky. But luck doesn't last indefinitely.

There are basically two ways to beat the market consistently. One is trading based on information not available to the rest of the market. This is sometimes banned, because it makes the market less fair and less efficient. It can also be a crime. The other is finding a market that's small enough or obscure enough that it's not interesting to the professionals.

But there is no investing stat that allows you to beat the market. Life is not an RPG.


You’re missing the word ”expect” in the original claim.

You can beat the house at blackjack, but you can’t reliably expect to do it.


Investment is hardly a zero-sum game. If it were, nobody would make anything by investing passively either. So how is that a reasonable analogy?

> you can’t reliably expect to do it.

Sure, I can't. But assuming that it's not entirely random chance, some proportion of people certainly can.


> Investment is hardly a zero-sum game. If it were, nobody would make anything by investing passively either. So how is that a reasonable analogy?

I think you’re reading too much into the analogy, which is maybe my fault for using an analogy. The point was just that it’s not that you can’t win, just that you very likely don’t have an edge - not because it’s mathematically impossible like in blackjack with a shoe that’s continuously shuffled, but because it’s so difficult.

> Sure, I can't. But assuming that it's not entirely random chance, some proportion of people certainly can.

Yes, but the bar is very high.


And if you had put your house on 26 black and it came up, you'd be doing great too.

Take 100 people randomly throwing darts at the companies in the SPY, and a fair few will do better than the SPY overall. That doesn't mean they could expect to beat the market.


Yeah. And 20 years ago, there was no iPhone, and Apple's MP3 player was only somewhat interesting compared to other brands. I did OK with Apple, but I'm not suggesting it was much other than luck.


> picking stocks yourself, the answer is also pretty cut and dried. There’s strong evidence no individual trader can expect to beat the market.

Is there? It would make sense if an average individual trader can't expect to beat the market. Claiming that there are no individual investors who did/can do that over a reasonably long period is both objectively false and rather absurd.


Read the GP carefully. "Expect to beat" is very different from "beat". You don't expect to beat the casino in roulette, but some people will luck out. That doesn't mean they could expect to win in advance: they should expect a small loss, depending on the table, and be surprised when luck smiles upon them.


> You don't expect to beat the casino in roulette,

Do you believe that investment is entirely random and there is absolutely no skill involved?

Because if not, that's a nonsensical analogy. You should use a game that involves both luck and skill, like poker (probably not the casino variety, though), etc.

Otherwise, if you can reasonably expect to beat 50% of all "players" (of course, it takes much more time to verify that in the market), then you can expect to make more than the average.


The problem is that any active trading strategy now needs to beat the market by the cost of a fund manager, the cost of their research, the cost of regular trades, and the cost of short-term capital gains taxes on those trades.

These add up significantly. Instead of having to beat the market at all, you have to beat it by an extra half of a percent or more every year. And you have to do it year after year after year.

All the evidence shows that actively-managed funds are a weighted (against you) coin flip. Less than half will beat the market in a given year. And the results from any given year are independent of the next.
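
To make the hurdle concrete, here's a quick sketch of how a fixed annual cost compounds (the 7% market return and 0.5% drag are assumed for illustration, not figures from the comment above):

    market_return = 0.07   # assumed gross annual return
    drag = 0.005           # 0.5% in fees, trading costs, taxes
    years = 30

    passive = (1 + market_return) ** years
    active = (1 + market_return - drag) ** years

    # passive grows ~7.6x, active ~6.6x: ~13% less terminal wealth,
    # before the manager has added (or subtracted) any skill at all
    print(passive, active, 1 - active / passive)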


Yeah, managed funds suck big time. They rip people off with fees, quite a few have insane performance fees, and they just don't beat the market.

Then I suspect that even with all the supervision in place, quite a few also manage to do Hollywood accounting.

Not to mention the friend of the cousin of the fund manager's niece who happened to buy x shares of y, or options, before, shocker, the fund invested in y.

We know these people cheat. If they were so good they wouldn't need to leech on fees.

I live in a tiny country where lots of funds are managed (second only to the US). I know the drill. The vast majority of them are about suckering people's money in, no matter what the fund is about.

Create 16 funds; after four years, show the prospectus of the one fund that performed best. Rinse and repeat.

Actively managed funds are a scam.

Also, depending on where you buy, there is anywhere from zero (good) to 1% in entrance and exit fees.

"Scam" is not a strong enough word.


> Do you believe that investment is entirely random and there is absolutely no skill involved?

The skill involved is more just "best practices" that let you match the market: Buy-and-hold, diversify, basically, do what the index funds do and you will be roughly +0 to the market. Beyond that, it's a totally random distribution that adds between -X and +X which allows some participants to beat the market and causes some to underperform. You can't tell beforehand which participants will beat the market, even having full knowledge of their strategies and skill. If you think you can, please tell me which active funds will beat the market in the next 10 years based on their skills. I'll invest in them.


> You can't tell beforehand which participants will beat the market, even having full knowledge of their strategies and skill

I never implied that I can. That fact doesn't prove that it's somehow fundamentally impossible. The problem is that it's impossible to tell if your "strategy" is working until a significant amount of time passes, and by that point market conditions might have changed to such an extent that you no longer have an edge (add to that the fact that it's hardly possible to determine what part of your success was luck vs. skill). So there is always a huge amount of uncertainty.

Albeit if we look back ~10-15 years, it's rather obvious that it was possible to beat the market by a very significant margin, e.g. there were clear rational reasons to believe that Nvidia would do better than competitors like AMD or Intel, and that there would be significant growth in GPU compute/ML/AI (accurately estimating the extent and exact timing was another matter, but that wasn't necessary at all to get above-market returns); the same applies to many companies in adjacent and unrelated sectors. Was I, or the overwhelming majority of investors, capable of realizing that and, more importantly, acting on it? Certainly not. But looking back, it obviously wasn't random.

The efficient-market hypothesis is clearly false, at least in the short to medium term. That in no way means that most investors are even remotely capable of utilizing this fact.


It must have been nice in the early 2010s to be smart enough to predict that AI would be a huge hit (after a couple of previous AI winters), that GPUs would be the key, and that NVIDIA specifically would reap the benefits. But I'm sure you're smarter than me and a lot of other people. And AMD also did pretty well during much of that same period, although I sadly sold my modest holdings after they went nowhere for years following a spike on some adoption by the big server makers. It would probably have made more sense to bet on Intel during that period.


Exactly. The point is that the only way you can know that a particular "strategy" was market-beating is by looking back after the fact. Just like you can only know who is a good coin flipper after running 10 trials and looking back at who flipped heads 10 times. And the strategies will be similarly repeatable.


We expect some individual traders to beat the market (and some to do much worse than the market); that's variance. But each individual trader should not expect to beat the market, because they don't know if they're one of the lucky ones.


> But each individual trader should not expect to beat the market

In aggregate sure. But unless we believe that it's entirely random some individual investors can still certainly expect to beat the market, they just can't verify that in advance.


You’re being pedantic in all the wrong ways.

I offer you a bet. We flip a perfectly fair coin. On every heads you gain 10% on top of your bet. On every tails you lose 10%.

It is fair to say that after 100 flips you may profit. If one million people play this game, someone almost certainly will. But you should expect to lose money at this game: by the end, the typical (median) player will have about 60% of their original holdings (0.9^50 * 1.1^50 ≈ 0.6), even though the arithmetic mean across all players stays flat.

In this game it’s possible for winners to exist. It’s not even uncommon! You only have to get at least 53 out of 100 flips as heads. Unfortunately there’s also no function that lets you determine a winner in advance, and the longer you play this game the greater the expected loss.

All of the available evidence shows that publicly-available actively-managed funds are essentially playing this game. As expected, many have incredible winning streaks… right until they don’t.

Yes, Ren Tech’s Medallion Fund exists. But you can’t contribute to it; they don’t want your money. Because that requires scaling market inefficiencies and that in and of itself is an intractable problem. Novel strategies ripe for profit don’t have unlimited capacity. They rapidly exhaust alpha.
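
Those numbers are easy to check (a quick sketch; math.comb needs Python 3.8+):

    import math

    # ending wealth after h heads in 100 flips: 1.1**h * 0.9**(100 - h)
    breakeven = min(h for h in range(101) if 1.1**h * 0.9**(100 - h) > 1)
    print(breakeven)                 # 53

    # the median player, at 50 heads and 50 tails:
    print(1.1**50 * 0.9**50)         # ~0.605 of the original stake

    # chance of finishing ahead at all:
    p_ahead = sum(math.comb(100, h) for h in range(breakeven, 101)) / 2**100
    print(p_ahead)                   # ~0.31 -- winners are indeed not uncommon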


> You’re being pedantic in all the wrong ways.

No, I simply disagree with the whole premise, at least to a limited extent.

> All of the available evidence shows that publicly-available actively-managed funds are essentially playing this game

Yeah that's true, I was mostly talking about individual investors and/or non public funds.


That’s not what “expect” means in statistics. If we’re rolling 100-sided dice (each person rolls once), no person should expect to roll a 1, even though 1% of people will in practice roll a 1. Likewise, no one should expect to beat the market, even though many will in practice.


> Likewise, no one should expect to beat the market, even though many will in practice.

My only point is that not every investor is rolling the same dice. It's just that it is effectively impossible to ever verify whether you were rolling a 90-sided die or a 100-sided one. It's rather clear that, at least in the short to medium term (e.g. 2-3 years), the stock market is not even remotely perfectly efficient (that doesn't mean that the overwhelming majority of investors are somehow capable of utilizing that fact, or that a significant proportion of those who did seemingly manage to do it weren't just lucky).


If you don't know what dice you have, then the reasonable way to model that is a random choice of dice.

And doing that gives you the same expectations as everyone using the same dice.


The common refrain is that "time in the market always beats timing the market".

The implicit assumption in that refrain is that, despite periodic dips, the U.S. stock market always goes up over time. This has been true since the Great Depression (see graph of S&P 500 since 1929)

https://www.officialdata.org/us/stocks/s-p-500/1929

The implicit assumption behind that is that the American economy always invents a way to grow. Buffett famously said, "never bet against America".

For as long as these assumptions match reality, it's likely that passive management will continue to succeed.


> American economy always invents a way to grow

There are various macroeconomic models which attempt to explain the factors of growth, for example the Solow growth model. In this model technological advancement is only one of three factors. The others are the savings rate and the population growth rate.

According to this model, you may not be able to innovate your way to growth if one or both of the other factors are contrary to growth. This may sound academic but there are concrete examples in the last twenty years of countries that have not grown because of a stagnant or shrinking working age population, e.g. Japan and Italy.

This has no bearing on the passive vs active debate, as I'm fairly confident that passive investing will always be the better strategy for a retail investor regardless of the growth potential of an economy.

It's just in a country with unfavorable macroeconomic conditions, passive investing may be the way to minimize losses rather than maximize gains.

The assumption of continued growth will probably hold true for the American economy through the end of the century at least, so for everyone here investing in US equities it is academic. But we can try to decompose an economy into factors and use those to check whether we expect growth to occur at all.


That is too strong of a condition. The economy doesn't need to grow for passive investing to work.

Even when the economy is flat, passive management works. As long as companies are economically productive, capitalism will hand over a chunk of the profits to the owners of the capital.

Of course, growth increases the size of that chunk year over year, but capitalism doesn't stop when growth stops.

Active investing is when you are looking to exceed this passive margin by timing your trades well. Active investing requires changes in productivity (such as growth).


Returns from stock investing come from increasing stock prices.

Stock prices increase when earnings of the company grow.

In other words: when the economy grows.

You can argue that Amazon, Apple, Google, Facebook, etc. will grow earnings even if the overall economy is flat or shrinks, but I don't see how that would apply to passive investing, i.e. investing in the S&P 500, i.e. investing in the 500 largest US companies.

The S&P 500 is the U.S. economy, and its companies are all sensitive to the overall economic situation. If people have less money, they buy less stuff. Amazon makes less money, their stock goes down. Apple sells fewer iPhones, their stock goes down. All other companies make less money, so they spend less on advertising, and Google and Facebook make less money.

I don't see a scenario where overall U.S. (or world) economy declines and S&P 500 doesn't decline.

In fact, a declining economy is an argument for active investing. Even when the overall economy declines, among the 6,000 companies listed on the stock market there will be some that are growing, and if you invest in them, you'll make money.


For me, returns on personal investment in stocks, funds, and ETFs consist largely of dividends.


> Returns from stock investing come from increasing stock prices.

There are other ways to make returns. Return from stock comes mainly from increasing stock prices and from dividends. But fundamentally, it comes from profits.

> Stock prices increase when earnings of the company grow.

There are many reasons stock prices increase. But whether they do or don't isn't really relevant.

When a company makes a profit, either:

* the profit is reinvested, the value of the company grows and the stock price grows, making a return for the passive investor

* the profit is returned as dividends, making the passive investor a return as well.

No growth needed for the individual companies either. As long as they are profitable, they make a steady return for the passive investor.

Thought experiment: imagine a company which is going to make 1 dollar of profit per year for all eternity, which it returns as dividends. For an investor with a 5% discount rate, that company is worth 20 dollars. Say he buys the company for 20 dollars. After 10 years, that company is still worth 20 dollars, as eternity is still eternity, but the passive investor owning the company has made 10 dollars from the company.

As you can see, the passive investor made a return, despite the company only being profitable, but not growing nor shrinking.

You will make a return on your investment when your investment makes a profit, that is capitalism. Whether the profit is increasing, decreasing, flat or going in circles does not really matter, as long as it is a profit and not a loss.
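
The thought experiment is just the present value of a perpetuity; a one-liner to check it (assuming the flat 5% discount rate from above):

    # PV of $1/year forever at a 5% discount rate
    print(1 / 0.05)                                    # 20.0

    # the infinite sum converges to the same figure:
    print(sum(1 / 1.05**t for t in range(1, 1000)))    # ~20.0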


This is all correct, but it misses a higher-order effect. Most investors will not take out the dividends but reinvest them. A few might sell because they are in retirement. But assuming retired people make up a small share of investors, profit is reinvested. Further, people invest a percentage of their income for retirement. All that means that work income and dividends make stock prices go up, and retirement makes stock prices go down. You could say that retired people consume and help companies make a profit, but that is actually worse for stock prices than investment, because the consumption requires companies to sell products and services that come with costs.


> time in the market always beats timing the market

> The implicit assumption in that refrain is that,

Only if you were fine with waiting 50-100 years. The market in 1950 was more or less at the same level in real terms as in 1906; of course, dividends were way higher back in those days. If we take that into account:

e.g. if you had invested $200 in the S&P 500 in 1906, you would have had ~$1,570 (inflation-adjusted, dividends reinvested) in 1950. That's an average annual return of ~4.7%, which is not terrible, but you would have made approximately the same by buying high-grade corporate bonds, just with way less volatility.


And the assumption that the stock market will go up from now is itself a form of timing the market. It assumes that now is the best buying opportunity in the whole future. I don't like that refrain.


> the American economy always invents a way to grow

I suspect the American economy grows more slowly than the stock market. The last 25 years have been about printing debt and expanding the money supply.


It’s also just capitalism and fiat currency - in a world where the money supply has to inflate, the prices in the market have to go along with it.


> The implicit assumption behind that is that the American economy always invents a way to grow. Buffet famously said, "never bet against America".

Or you invest in a total world market fund for better diversification.

Diversification would have helped anyone in Japan(-only) in 1990, and anyone in the US(-only) in the 2000s. It's a very easy strategy nowadays:

* https://investor.vanguard.com/investment-products/etfs/profi...

* https://www.vanguardinvestor.co.uk/investments/vanguard-ftse...

* https://www.vanguard.ca/en/advisor/products/products-group/e...


It is the opposite. Market timing does not work reliably. Active management produces worse results in the long run. No individual trader or active manager can consistently beat the market; however, active funds may have periods (even several years) where they outperform.

For private investors, buy-and-hold of broadly diversified ETFs is the best way to do it. The easiest way to get started is a one-ETF portfolio, e.g. Vanguard FTSE All-World or SPDR MSCI ACWI IMI. They perform internal rebalancing automatically and you have virtually nothing to do: buy them and don't look at them for the next 20 years.


Noting that it is possible to beat the market, with strategies/algorithms that are generally non-public. For example, the Medallion Fund; see https://posts.voronoiapp.com/markets/Jim-Simons-Medallion-Fu... . Note that these crazy performance stats are after the steep fixed + performance fees.


Strangely, the other funds operated by the same company, which actually are open to outside investors, have not performed as well. Exactly why remains unexplained.


Acquired.fm has a great episode on the RenTec Medallion fund vs. their institutional funds.

‘David: The way that some folks we talked to described the difference between the institutional funds and Medallion to us is that Medallion’s average hold time for their trades and positions is (call it) a day, maybe a day-and-a-half. Whereas the average hold time for the institutional funds positions is a couple of months.’


From some of their legal settlements, it seems a not-insignificant part of their advantage was dreaming up obscure tax dodges on short-term capital gains, later ruled illegal, to keep more of the fund's money invested.

https://www.moomoo.com/news/post/5891516/the-biggest-tax-eva...


Seems pretty obvious that they're keeping their best-returning strategies for employees rather than outside investors?


It’s possible only in the sense that it is possible to flip a coin heads 10 times in a row. One out of 1024 should do it. But you don’t know which one will until the experiment is over and you look back at the results.


Yep. Statistically, there must be some outlier. Always. And… good luck having the data they use to trade and the money to just enter into the markets they participate in.


But why stress about beating the market? Just be the market with an ETF that tracks the S&P 500 index. Literally set up auto-invest from your paycheck. Go to sleep (Rip Van Winkle style). Wake up 40 years later and retire comfortably.

Look at total returns over the last 40 years on the most popular indices in the world. S&P 500 crushes them all. I see a lot of "Internet advice" recommending various MSCI world indices. They are all much worse than the S&P 500.
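
The arithmetic of that plan, as a rough sketch (the $500/month contribution and the 7% annual return are assumptions for illustration, not historical guarantees):

    monthly = 500           # assumed paycheck auto-invest
    annual_return = 0.07    # assumed long-run return
    r = annual_return / 12

    balance = 0.0
    for _ in range(40 * 12):
        balance = balance * (1 + r) + monthly

    # roughly $1.3M on $240k contributed over 40 years
    print(f"${balance:,.0f}")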


Picking the S&P500 over a world index because you think it will outperform, has the same problem as picking individual stocks over an index. You can't actually know which will outperform in the future.


Or, where in the world do you need to spend your money?

I live in the UK: if I buy the S&P500 over the FTSE100 (or even more so the 250, the next 250 largest companies which are typically more UK-market-oriented) I'm making a US-weighted bet. But maybe I think I'll move there, and should have that exposure. Or maybe I spend a lot of money all over the world and want a more global exposure overall.

I think a vast-majority-index allocation is right for basically everybody, but you do still need to think about which index/indices are most applicable to your situation/intentions.


Over 40% of revenues from S&P 500 companies come from overseas. That is world enough for me.


You don't need to be the best, just do well.


Has the World Index ever outperformed the SP500 over a 40-50 year span?


World indexes are normally somewhat less volatile (they’re typically _much bigger_; MSCI World is 1400 companies, and MSCI ACWI nearly 3000), which may be a useful attribute, depending on what you’re going for.

(Conversely, there are smaller indexes which tend to beat the S&P500, like the NASDAQ100, but there’s a volatility cost.)


Sometimes I wonder whether ETFs that track top valuations will lead to some weird stickiness and overvaluation in, say, the S&P 500.


I have the same thoughts. Eventually there will be a lot of money to be made breaking the S&P 500.


Can you explain the "breaking" trade? And why haven't we seen more written about it?


Think of when George Soros broke The Bank of England for an example of the type of trade.

There is a lot of demand for the S&P 500 index, but that demand isn't exactly tied to the fundamentals of the index, and the price isn't tied to the value of the underlying companies; it's tied to the demand of people looking to save for retirement or a place to store a nest egg. This is an opportunity for price discovery to get things wrong, and eventually the market should correct that.

https://www.investopedia.com/ask/answers/08/george-soros-ban...


I don’t get it. Soros was able to break the pound because the UK government was committed to maintaining an artificial price. That’s nothing like an ETF or mutual fund of the S&P 500, which probably has one of the most efficient (relative to the capital involved) market price discovery mechanisms in history.


Not sure what they mean specifically but you've probably seen a lot written about it, in terms of BRICS, the petrodollar, ARM in China, subsidies on electric cars, and so on.

Personally I try to avoid investing in the US for political reasons, besides the wishful expectation that the empire could fall within my lifetime and hence be a not so good investment.


It's sort of self-evident: if you are freakishly capable of spotting mispriced securities in a market full of smart, hard-working people who are paying attention, you can do better than average. If you aren't freakishly capable... you can't.

It's sort of like "does playing pro golf make sense?".


There’s an interesting corollary to this. There’s some evidence that low-volatility strategies can outperform, at least when not using leverage. My working theory is that it is due to an overconfidence bias: people who actively trade assume they can pick better stocks, or else they wouldn’t trade. This manifests as more volume in high-beta stocks, leaving low-beta stocks undervalued.


They’re only mispriced until they’re not, or they’re priced well until they’re suddenly mispriced. That is to say, the market is an evolving system varying along the time axis: calling things “mispriced” assumes that time isn’t rolling along, that no new events happen and no new information arrives. Everything’s price today is just a guesstimate until tomorrow’s guesstimate follows some new data. Granted, it’s not like the past, where whole companies sat underappreciated for lack of analytics. But coming out of covid, companies like Rolls-Royce (makes aircraft engines) had their prices crash completely, were demonstrably “mispriced” for ages, and are still recovering now that air travel is back to 2019 levels. The price wasn’t wrong when the planes weren’t flying; it was just cheap to those who believed covid would get sorted eventually and that the debt RR took on to survive would get repaid.


The fact that they are mispriced and then not mispriced at a later time is almost the entire reason one has the potential to make better-than-average returns.


There’s the famous bet Warren Buffett won against an actively managed fund: https://www.investopedia.com/articles/investing/030916/buffe...


There's strong data supporting passive investment. I've found the stats in Andrew Hallam's book, "Balance: How to Invest and Spend for Happiness, Health, and Wealth" (https://www.amazon.ca/Balance-Invest-Happiness-Health-Wealth...) quite eye-opening.


The catch-22 for active management is that if they were actually good, they would just use their strategies to manage their own money.


They do. But if you offer the service to other people, you get a lot more money to play with (meaning you can do more or different things than you could with less) and get to charge performance fees etc. in addition to your own capital gains.

Really, you could say it about absolutely any job, it's just a bit more direct with managing money. 'If you were any good at writing software you would just sell your own SaaS', etc.


That's the common claim, but if you actually look at the successful funds that beat the market year after year, their public fund is always the low-yield, experimental-strategies one, while the internal funds demolish the market. The reality is that the most lucrative strategies have a yield cap, and the people who find them quickly surpass that cap, so they just keep the strategies to themselves.


That doesn't really invalidate my point though: the extra capital gives the option, and the fees.


They probably do. They just make it their day job by selling their services to others as well.


The catch-22 for this assumption is that they want to be richer than their own money would allow.


Yes, read "The little book of common sense investing" by Bogle.


The general wisdom is that it’s basically impossible for most people to tell the good fund managers from the bad/mediocre ones.

Except Warren Buffett. A lot of people went with Berkshire Hathaway and did very well.


> Except Warren Buffett. A lot of people went with Berkshire Hathaway and did very well.

Buffett has been underperforming the S&P 500 for about twenty years now:

* https://www.linkedin.com/pulse/warren-buffett-has-underperfo...

* https://news.ycombinator.com/item?id=37827101


Take a look at the automated systems that have resided on Collective2 for more than 2 years.

https://collective2.com/grid

There are only a couple that have a large history of trades, fairly even equity curves, more than two years of history, and small(ish) drawdowns (< 30%).

In other words, it's difficult but possible.


On average, yes. An index tracker will get the average market return ignoring fees, so its returns will match those of the average active fund.

However, the active fund will charge higher fees.


Active does much better. If you know how the price moves, you can easily beat the market.


Source?


I think it kind of goes both ways. There are times when you absolutely want lambda functions instead of cgi-bin scripts. But conversely - there are times when you absolutely want cgi-bin instead of lambda. (For interfacing with other linux services or packages, for example). The two tools don't always substitute for each other.


I mean, that is the difference between using something because it's the flavor of the day and using something because it's the best tool for the job.


I think the spirit of this article is correct, although some of the digs at modern web tech and SPAs seem to be beside the point.

I used to have a "mildly dynamic website." It was a $5 digital ocean box. It ran nginx with php-fpm, mostly so it could have a Wordpress install in a subdirectory, and it had a unicorn setup for an experimental Rails app somewhere in there.

Given that environment, the "mildly dynamic website" experience that TFA talks about was absolutely true. If I wanted a simple script to accept form input, or some little tiny dynamic experimental website, I could trivially deploy it. I could write PHP (ugh) or whatever other backend service I felt like writing. I ported the Rails app to golang after a while. It was fun. It made for a low cost of entry for experimental, hackish things. It's a nice workshop if you have it.

The thing is — if you are running this setup on your own linux virtual machine — it requires endless system maintenance. Otherwise all the PHP stuff becomes vulnerable to random hacks. And the base OS needs endless security updates. And maybe you want backups, because you got lazy about maintaining your ansible scripts for system setup. And the price of the $5 virtual linux box tends to go up over the years. And the "personal website" model of the web has kind of declined (not that it's altogether dead, just marginalized by twitter/facebook).

So I got exhausted by having to maintain the environment (I already do enough system maintenance at work) and decided to switch to static HTML sites on S3. You can't hack it anymore. But so far — I can live with it.


>and decided to switch to static HTML sites on S3

It's really hard to overstate how great S3 sites are with CloudFront in front of them. Mine costs me <$1/month, and I have essentially no concerns about security, DDoS, maintenance, anything. If I want to update it, I push some different files to S3. The backups are the other copy of the data on my local machine that I pushed to it.

The added complexity for any level of dynamic-ness is really just not worth it, unless I'm going to go to the trouble of making a full-on app that needs a revenue model.


CloudFlare Pages are another great product for zero maintenance static sites, and they can be made mildly (or heavily, if you want) dynamic with CloudFlare Workers.


Second Cloudflare. And it's super easy to hook up to domains registered via Cloudflare too.


Similarly, Github Pages are great. You get rudimentary dynamic support by using a static site generator if you need it.


Shared hosting is probably better, and AFAIK more common, for the mildly dynamic website. The host handles a lot of the admin tasks like OS updates that you have to handle yourself with a VPS.


NearlyFreeSpeech.NET is good for this. Their main tier - "production" sites - is very inexpensive, and the admins take care of OS and server-software updates. They have another tier - "non-production" sites - that is even cheaper and can be perfectly sufficient for a personal homepage. The admins maintain these servers as well, but they might do beta testing on them.

The environment is fully hackable and has PHP, SSH, SFTP, MariaDB, dominant languages like Perl and Python, obscure languages like Haskell and Lisp, etc etc.


+1 for NearlyFreeSpeech! I’ve used them for years paying only 40-45 cents a month. I love that I can just ssh in and mess around. My site is mostly static but recently I wanted to add a private section for specific family and friends. So I implemented OAuth 2.0 login with 2 php files and an .htaccess rule.


Love nearly free speech.

Although I wonder how often your family actually uses it.


And even more obscure languages like Forth and Octave. The only thing to watch out for is that they run FreeBSD (instead of a more "normal" distribution) so if you're used to Linux-as-seen-on-Debian-or-Ubuntu there are a few things that are different (but so cozy).


For what it's worth, FreeBSD is actually its own thing and not Linux at all. It's descended from Berkeley Unix and has no code in common with Linux or GNU (though it can still run cross-compatible software).


Correct - but if you're normally interacting with Linux boxes on the command line you're probably not going to be too far from home since the _programs_ behave mostly-the-same.


Yes, though I guess this means we're in the era where "Linux" is more widely understandable than "Unix".


For some reason I've never heard of it before. Looks like a good value as long as you avoid storage. Even shows D as a supported language!


Does it support dot net?


This. For people looking for hosting as a service that don’t need scale (and even for some people who think they do) shared hosting is often the best low-admin + low-cost solution. Buuut you can’t brag about your cloud setup.


When I was checking out the grav flat-file CMS, they had a recommendation for PaaS (php as a service)

https://learn.getgrav.org/16/webservers-hosting


As much as i love every P standing for PHP, "PaaS" usually means Platform as a Service


And then you forgot to add: "I had to stick it behind Cloudflare because someone decided to DDoS it and send me 500GB/s of traffic for no apparent reason at all."

The early web was the wild west. The modern web has turned into small fry trying to hide in the shadows of megalodons so they don't get eaten by a goliath.


You don't have to go as far as DDOS.

The blog post mentions a comment section like it's something you can "just do" - no one wants to build their own comment section in 2024. The amount of crap you'll get once you have a mildly popular post is insane.


> it requires endless system maintenance. Otherwise all the PHP stuff becomes vulnerable to random hacks

How so? I've seen PHP websites & apps run for 10+ years in production without updates. Even longer with a simple "sudo apt update && sudo apt upgrade" every few months and a "composer update" every year or so. The actual maintenance burden is very, very low.


Years ago a Digital Ocean virtual server of mine stopped working because I had never upgraded Ubuntu to the newest major version. After a few years, the version of Ubuntu was no longer supported by the Digital Ocean hypervisor and couldn't mount or boot at all.

In my experience, yes, you absolutely need maintenance. In the past I've had to upgrade from HTTP to HTTPS, upgrade the OS, upgrade to newer versions of external APIs and embedded components because the old ones were deprecated, handle a domain registrar shutting down, and then, yes, absolutely PHP updates and upgrades for security, which then start giving you warnings because less-secure versions of functions are being deprecated...

And frequently updating the one thing that's broken necessitates upgrading a bunch of other things that breaks other things.

I literally cannot imagine how you would keep a PHP site running on a virtual server for 10 years without any maintenance. I need to address an issue probably roughly once a year.


These are all problems that shouldn’t exist. You have succinctly described the problems with modern IT. Software doesn’t need to have an expiration date. It doesn’t decay or expire. But because of our endless need to change things, rather than just fix bugs, we end up with this precarious tower of cards.

If, as an industry, we focussed on correctness and reliability over features, a lot of these problems would disappear.


But the hardware does expire. Computers aren't just magically "faster" than they were decades ago; they're altogether different under the hood. An immense number of abstractions have held up the image of stability but the reality is that systems with hundreds of cores, deep and wide caches, massively parallel SSDs and NICs, etc. require specialized software compared to their much simpler predecessors to be used effectively. Feature bloat is a major annoyance, and running the old software on new hardware can give the appearance of being much faster, until it locks everything up, or takes forever to download a file, or can't share access to a resource, or thinks it has run out of RAM, or chews up a whole CPU core doing nothing, etc.


Yes, these problems shouldn’t exist. But they do.

One of the big strengths of the Web is Mozilla's commitment to "don't break the web".

But this is hitting its limits, because the scope of Javascript is being expanded more and more (including stuff like filesystem APIs for almost arbitrary file access — as if we had not learned from Java WebStart that that’s a rabbit hole that never stops gifting vulnerabilities), so to keep new features safe despite their much higher security needs, old features are neutered.

I lost a decentralized comment system to similar changes.


I agree there's some truth in what you say. I do think these upgrades are part of a path towards correctness and reliability (bug fixes, security vulnerabilities, etc).


"I have run production websites where I didn't patch security for months or years on end." Linux users wondering why nobody takes them seriously.


Security people on high alert for every possible scenario with no sense of relative risk or attack surface wonder why their concerns aren’t taken seriously.


This. Furthermore, this posture has percolated down to home computing environments (because it is all Windows or Linux), so even my home computer has to receive constant updates as if it's controlling a lunar lander.


I have a box with nearly 5 years of uptime, and the one it replaced had at least that much; my experience matches GP's. unattended-upgrades gives you 99% of the patches, and a manual upgrade every few months will get you the rest.

If you see a problem with this, why not point it out directly, instead of this snark?


I am puzzled that your site required constant maintenance. I run a similar setup, hardened using systemd service restrictions, with nothing running as root. I subscribed to the Debian security announcements and a couple more mailing lists. It turns out I need to spend about 20 minutes per month maintaining it.

I also find that PHP works much better than Go on the maintenance front. With Debian I get automatic updates of all the PHP dependencies I need, so the security announcements are a nice single source of truth. With Go I would need to set up dependency-update monitoring myself and recompile/deploy the code as necessary.


They didn't say constant maintenance, they said endless maintenance. 20 minutes a month is a never ending commitment of time. They have better things to think about.


Yeah, like developing a résumé.


> The thing is — if you are running this setup on your own linux virtual machine — it requires endless system maintenance. Otherwise all the PHP stuff becomes vulnerable to random hacks. And the base OS needs endless security updates.

Just use Debian with unattended-upgrades. Done. Only rarely do you have to do anything manually.

Setting up daily backups with verification is also a one-time thing.


I tried S3 for my static HTML site, but the problem is that my personal website gets infrequent visitors, and for each visit the first load is always slow, probably because my website gets moved from memory to disk. The thing is, my website is less than 20KB, and I don't want it to be slow for all users just because it's visited infrequently. Managing my own server allows me to lock my website into memory.


Linode's base tier VPS is still $5/month.


A daily apt update/upgrade in crontab has been working fine for me. What is the maintenance beyond that?


> The thing is — if you are running this setup on your own linux virtual machine — it requires endless system maintenance.

Yeah, this is exactly what I talk about in this post on why I use shared hosting:

Comments on Scripting, CGI, and FastCGI - https://www.oilshell.org/blog/2024/06/cgi.html


As always, it's really impressive to see how much technical detail they release publicly in their RCAs. It sets a good example for the industry.

Also — quite impressive to make major infrastructure and architecture changes in a few months. Not every organization can pull that off.


> quite impressive to make major infrastructure and architecture changes in a few months

And have it work first time round.


I feel there’s a sweet spot where if you do it too quickly, it’s a bad sign. And if it takes years, risk just keeps going up and up until it becomes basically impossible to do smoothly.


The blog post contains no details at all about how they achieved high availability. I'm disappointed.


There is not much impressive here. Their architecture seems to rely on a couple of datacenters and on whether somebody has flipped the right switches in the right places. That means you will keep hearing about regular outages, followed by more nice postmortem blogs. I don't see a fundamental approach to reliability based on concepts like blast radius, expecting anything to fail at any time, or bulkhead patterns.

Here is a free hint: by talking so much about where their data centers are located, in my view, they already failed item one on my checklist. Principle number one of physical security is that you don't say where your data centers are, except of course to a very restricted "need to know" group.

As predictions have no value unless made prior to events, I predict the next outage will involve some common core component, with some on/off type of config that "could not be foreseen". Then on to the next blog... and on to the next outage...


I write Ruby professionally, and while I agree that this can lead to some very sloppy code, I don't find that it causes a lot of problems in practice.

One place where we do use nil conversions is logging and error handling — you can cast anything, including nil, to a string, and that is useful in debugging.

Probably more often than nil conversions, we use the safe navigation operator &. as a way of working with "might be nil" values without adding too much verbosity.
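As a concrete sketch of both idioms (plain Ruby; the variable names are hypothetical):

    # Safe navigation returns nil instead of raising NoMethodError
    user = nil
    user&.name                    # => nil, no exception

    # nil-to-string conversion is handy in log lines
    request_id = nil
    "request_id=#{request_id}"    # => "request_id="
    request_id.to_s               # => ""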

I do miss the way languages like Swift think about Optional values, where you have to be much more explicit about whether or not you are dealing with `nil`. It's easier to reason about and leads to clearer codepaths. And in general, I miss stronger type systems in Ruby.

But in a professional environment with relatively high developer expertise in the language and relatively clear style guides, we're still very productive in Ruby and don't suffer hugely from the typing issues. I guess eventually you learn to work around the warts in whatever language you use.


I used to write Ruby professionally, and found the lack of any type system and duck typing to cause pretty severe problems. Not knowing if the variable you were dealing with was a String or a URI was great until it very suddenly became not so great.

But this kind of API where you have 'safe' operators that don't throw is a pretty good convenience, since forgetting that a variable might be nil is pretty common (or passing nil into a method that previously never accepted a nil and where the API was never tested for nil).

There's a better way to do it, like what Rust does with Some()/None enums, forcing you to unwrap the value and handle the None value explicitly. That's much better than 'unsafe' operators that just blow up and throw/panic when you don't think about the edge condition. (So I wouldn't consider the Python alternative any better: you're just trading one possibly-bad edge-case behavior for another, forcing programmers and code reviewers to constantly keep them in mind, and blaming them when it goes wrong, with one group or the other feeling superior about their choice of when to blame the programmer for forgetting to handle the behavior.) I think this is also roughly equivalent to languages like C# 8.0+ that have nullable/non-nullable types and static checking to make sure you're using them right.


> and found the lack of any type system

I'm not surprised you found Ruby uncomfortable when you describe Ruby as not having a type system. It might just not be the right language for you. That's fine.

> Not knowing if the variable you were dealing with was a String or a URI was great until it very suddenly became not so great.

Nothing stops you from declaring that you expect a given type in Ruby; it's just code, and not mandatory, though you can certainly write code to impose mandatory types. There was a while when writing such a library seemed a rite of passage for Ruby developers, but the reality is that they buy far less than people think, something you only realize once you've tried writing one and/or spent some time using them.

> There's a better way to do it, like what Rust does with Some()/None enums, forcing you to unwrap the value and handle the None value explicitly

Nothing stops you from doing that in Ruby; in fact, there are several monad implementations for Ruby that provide building blocks for those kinds of patterns if/when you want to use them.
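For instance, a minimal sketch using the dry-monads gem (one option among several; the USERS lookup is a made-up example):

    require "dry/monads"
    include Dry::Monads[:maybe]

    USERS = { 1 => { name: "Ada" } }

    # Maybe() wraps a possibly-nil value as Some(...) or None
    def find_user(id)
      Maybe(USERS[id])
    end

    find_user(1).fmap { |u| u[:name] }.value_or("unknown")   # => "Ada"
    find_user(2).fmap { |u| u[:name] }.value_or("unknown")   # => "unknown"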


> I'm not surprised you found Ruby uncomfortable when you describe Ruby as not having a type system. It might just not be the right language for you. That's fine.

Thought we were supposed to "respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize" around here and not nitpick misstatements.


I did not suggest you literally believe Ruby has no type system at all, but your statement still implies a view of how Ruby treats typing that makes it a fairly natural consequence that you'd find Ruby uncomfortable. There's no value judgment in that, nor nitpicking or criticism. Just an observation that Ruby might not be right for you.


I'll take your suggestion along with my 12 years of professional experience with Ruby under advisement.


I too wrote Ruby fulltime and professionally. For over a decade.

The sheer amount of code I wrote just to protect against nil errors is staggering. Every third function or method had guard clauses and/or tests, just to ensure that the data was in the expected shape (that it quacked and walked like a duck). I spent hundreds, maybe thousands, of hours debugging actual production nil errors and weird conversion errors. Sometimes legacy database records, sometimes weird race conditions causing corrupt data, sometimes (most often) just stupid mistakes by me or another dev.
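(For the uninitiated, a typical specimen of that kind of guard clause, with hypothetical names:)

    def ship(order)
      raise ArgumentError, "order is required" if order.nil?
      raise ArgumentError, "order has no items" if order.items.to_a.empty?

      # ...actual business logic...
    end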

All of these - all! - would have been caught by a type system and type checker like Rust's. All.

Which then leaves just the huge category of stupid bugs that I'll make in the business logic.

But now, working with Rust, the LSP tells me within milliseconds that I overlooked something that in Ruby would've been a data error. Often my Ruby test suite would find these after minutes of running. Sometimes we would find them after deploying to prod and seeing havoc. Sometimes they'd take weeks to pop up.

I'm convinced Rust makes me develop much faster because of this.


If you need to test against nil in "every third function or method", that reflects broken abstractions and/or very dated Ruby. For "maybe nil" situations, modern Ruby code will lean heavily on safe navigation, dig, or wrapped values, and sometimes monads or similar.
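For example, dig walks nested structures and simply returns nil when anything along the path is missing (made-up data):

    payload = { user: { address: nil } }

    payload.dig(:user, :address, :city)   # => nil, no exception
    payload[:user][:address][:city]       # NoMethodError: undefined method `[]' for nil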

If that's not enough for you, Ruby has optional type checkers available.


The position of “I regularly don’t know if I have a string or a uri” is indeed a commonly raised issue. It folds out to two other problems:

It implies a lack of conventions around interfaces and variable names. The surrounding statement also implies that it was not easy to figure out quickly (hard to exercise that code synthetically).

Highly successful workflows in loosely typed environments like this depend upon a strict culture to ensure both of these aspects are under constant improvement.

This is difficult to scale, and improving these aspects in other languages also helps other workflows, but their impact in these environments is much higher. It’s part of the reason why you would hear _so much_, density-wise, about code patterns, code health, testing, and so on in the ecosystem in its peak era; all of those things were really just mechanisms to try to achieve the above factors.


In the grand scheme of things, it would be difficult to come up with something worse than having both null and undefined in JS, while also being able to define variables as undefined.


> One place where we do use nil conversions is logging and error handling — you can cast anything, including nil, to a string, and that is useful in debugging.

Lots of languages can format anything to a string though. Even Java will let you write

    "foo " + null
And logging / error handling / debugging seems like the context where you would least want nil to become an empty string (and thus be confusable with an actual empty string, or not appear in the output).


I agree with the grandparent's general point, but I would say that it is string formatting, more precisely, where these nil conversions are useful.

For example, you can write something like:

    bananas = 4
    puts "you have #{bananas} banana#{'s' if bananas != 1}"
And this works because: `if` is an expression, `if` evaluates to `nil` if the condition is not met and there's no `else`, and `nil.to_s` is an empty string.

> And logging / error handling / debugging seems like the context where you would least want nil to become an empty string

Yep, 100% agree. And I think Ruby's designers agree too, since the language has the shortest method possible, `p`, for print-debugging. You can do `p something` and that'll print a useful string for debugging (`""` for an empty string, `nil` for nil, `[]` for an empty array, etc).
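(A quick sketch of the difference:)

    puts nil.to_s   # prints an empty line
    puts "".to_s    # prints an empty line, indistinguishable from the above
    p nil           # prints: nil
    p ""            # prints: ""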


`"foo " + nil` is a TypeError in Ruby. You need to be explicit.


Thank you for completely missing the point, your contribution is appreciated.


The snark is completely unnecessary, and absolutely not appreciated.

What is the point of that example, then? Absolutely nobody imagines any language has no way of converting a value to a string.

To me it came across as implying this was equivalent to what Ruby will do with the same expression.


There are many stronger type systems for Ruby. And while that is partially a joke, there are various Optional type implementations, though of course you do then need the discipline of wrapping values as needed.

I don't tend to use them myself because, between safe navigation and things like Array(), it's rarely needed.
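Kernel#Array is a nice illustration: it normalizes nil and scalars into arrays, so "maybe a list, maybe nil" inputs need no guards (`tags` here is a hypothetical variable):

    Array(nil)      # => []
    Array(42)       # => [42]
    Array([1, 2])   # => [1, 2]

    tags = nil
    Array(tags).each { |t| puts t }   # iterates zero times instead of raising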


There's something eerily true about TFA's observation that data centers have a big footprint but a really low headcount. I work in the Atlanta area, and it's amazing how long you can work in tech without ever setting foot in a data center or even seeing one from the outside. They are truly invisible infrastructure, hidden behind ever more layers of abstraction. I'm all for abstractions that don't leak, but I'm not sure it's a purely good thing to get so disconnected from the hardware layer.

I used to tag along with our sysadmin sometimes, back when I worked in a place that ran its own data centers - a decade ago - and they aren't really enjoyable places (noisy, not built for human inhabitants), but there's something nice about seeing the actual hardware that runs your applications. It kind of keeps things grounded.


People have probably seen more data centers than they think, and just not noticed. They just look like large buildings if you pass by on the highway.

There used to be a lot of telephone buildings in the US. AT&T built them right in the middle of communities in many cases, but they attempted to match local architecture. Most people never realized they were there.


Yep. I recently found out the nondescript brick building in the slightly shady part of town actually houses tens of millions of dollars in servers.

It was quite the shock to go from the grimey city alley to a perfectly clean, high tech, and highly secure interior.


So many buildings downtown were getting filled up with telecom and data equipment that the Atlanta city council passed a law about a decade ago to keep it from pushing out all other uses. There are lots of historic buildings that are little more than facades wrapped around these places due to how much fiber runs through downtown making it a very attractive place to locate. The city wants more people downtown so they'd prefer more housing, offices, retail, and hotels (human, not telecom). The law seems to have worked or at least contributed to helping to rebalance uses there.


The ones with security fences stand out more obviously.


Tangential, but you might like to mull this over in the shower:

>"Ask HN: At what point do the Cloud's datacenters become national security sites?" [0]

It's an interesting aspect to think about when taking a high-level, objective look at the importance of data centers to all aspects of modern society.

>>"...Companies like AWS, running data centers for the Department of Defense (DoD) and Intelligence Community (IC), demonstrate close collaboration between private entities and defense agencies. The question remains: are major cloud service providers actively involved in a national security strategy to protect the private internet infrastructure that underpins the global economy, or does the responsibility solely rest with individual companies?..."

https://news.ycombinator.com/item?id=38975443

