
I wish I had more than one upvote to give - what you just wrote is the truth of the matter.


It's not just you.

The only thing I'm confused about is why more C standards keep coming out. C is what it is. Work with the archaic parts of it as need be. You use C for ultimate cross-platform compatibility (every exotic platform and its mother has an ANSI-C/C99 compiler). If you're able to run on the most bleeding-edge C compiler supporting the most recent WG version of C... then why not just use another language? I say this as someone who loves C. If there are bits that feel old, you just create a DSL that compiles down to C to get around those problems... but you don't change C itself. Just my perception after working with all this for many years now.


For me the big problem with most C competitors is they go overboard. The only alternative I've tried that sparked my fancy was D with -betterC flag. It's unfortunate that it's not its own language, but rather a "profile" of a much much larger language.


"I've turned down job offers because they were in C++"

Yep


C++ is huge and highly interconnected. No matter whether you like it or dislike it, and how experienced you are with software development, you either invest years into becoming good at it, or stay away from C++ jobs.


If I were going to go back in time and offer my younger self career advice, I would probably say "just stick with C++". It was the most popular programming language when I started my career. It will be the most popular programming language when I end my career.

Think about it. Is there any widely-deployed OS not written in C/C++? Any web browsers? Spreadsheets? CAD software? Text editor?


Having as much experience as you do and still conflating "widely used because it's super good" and "widely used because of network effects and inertia" is really puzzling.

Surely you understand that your argument is disingenuous, right? Nobody wants to start over a new OS or a browser in another language due to the huge costs that no corporation nowadays is willing to shoulder (due to their own interests). Otherwise a lot of people would very definitely write OS-es and browsers in Rust, D, V even, Zig, Nim and a good amount of others.

C/C++ were there first, that's all there is to it, it's that simple.


It's kind of like how we all have our favorite podcasts to learn about new things. When it's a topic we don't know much about it's so interesting and we learn so much, but when it's a topic we're experts in, we get exasperated about how inaccurate or misleading it is.



  Location: DMV (DC, Maryland, Virginia)
  Remote: Hybrid preferred, but flexible for any arrangement.
  Willing to relocate: Probably not.
  Technologies: C, Rust, Python in Linux environment. Mostly do distributed systems, network protocol design and analysis, and related work. Worked in Aerospace flight software, embedded systems, space mission operations, etc. "Research Engineer" fits my ideal scope-of-work best. Other non-technical/management/leadership activities not mentioned here. I seek to avoid working in product-oriented "Agile" teams.
  Résumé/CV: On request
  Email: vb.temp@gmail.com
  Other: MS and BS in Computer Science. Security clearance (detail on request)


I just got an update from my realtor that, compared to 2021, houses are now getting 5-10 offers instead of 10-20. Every open house I go to is packed and has an offer deadline at noon the day after (they aren't bluffing - it goes straight to under contract the next day). It's impossible to have any contingencies - even home inspection. Most are going for $50-$100K over asking. This is even with 7% interest rates, the stock market down 20% and tens of thousands of layoffs in my relatively tech (and Amazon) heavy area. The rental market is so tight - there's basically nothing on the market and the few dilapidated homes that go for rent are between $5-7k/mo. And this is late-March 2023.


In the 90s it was typical for a house to be on the market for 3-6 months before selling. Sellers generally received 1 offer.

Currently the nationwide days-on-market is still way below long-term historical norms and even below recent history. https://fred.stlouisfed.org/series/MEDDAYONMARUS


Part of it is the web. Buyers are looking at easily 10x the number of homes they used to, if you include all the ones they look at online.

They have a much better idea of what's available and what's a good match for what they want.


This is huge! I looked at every home listed on Zillow in my area for months before checking out 2 in person and submitting an offer on 1, which was thankfully accepted.


That was before the major housing shortage. The US is short about 6M homes by some measure [2], and the Bay Area is short a staggering quantity all by itself, so much so it has its own Wikipedia article. This remains the fault of onerous zoning rules and NIMBYs. [1]

[edit] No supply + high demand = prices go up. Higher interest rates only help at the margins because it doesn't actually change the demand factor which is that the humans in America need a place to live, and the more they make, the more they can afford to pay. And it certainly doesn't change the supply factor!

[1] https://en.wikipedia.org/wiki/San_Francisco_housing_shortage

[2] https://www.cnbc.com/2021/09/14/america-is-short-more-than-5...


FWIW in my East Bay neighborhood, I'm noticing in the last couple months it's taking closer to a week (rather than a day or 2) for "Coming Soon" real estate signs to change to "Sale Pending".


The most popular city, according to Zillow, is Prairie Village, KS.

https://www.zillow.com/research/most-popular-city-2022-31925...

This is about 1 mile from where I live in Overland Park, KS. PV is all single family homes with near zero apartments. Of course the city is trying to re-zone to put in higher density housing, and the population is vehemently opposed to it; I don't blame them. People move there for that reason. Higher density leads to more crime and other frustrations. People learned their lesson from the pandemic.


Why would higher population density lead to higher crime rates? Do you have a source for that?

Poverty and general desperation lead to higher crime rates as people resort to illegal means to get by. And that's what happens when there is not enough housing for people.

In the United States, we have a whole bunch of terrible housing policies, so maybe people are conflating density with poverty. But you can find many examples across Europe and Asia where there is little correlation between density and crime.


It makes a kind of intuitive sense. If a certain % of human interactions are crime, and you create circumstances where there are more interactions (packed night club, large convention, dense city, etc), you could expect more of all kinds of interactions including crime. Whether that's how it works in reality is another question.


See this is exactly what I'm talking about.

- "I want affordable housing!"

- "Hm, ok, how about we build new houses in the places people want to live, increasing supply to meet demand"

- "NOT LIKE THAT"

It's just like the "no take only throw" dog meme.

If you don't want to live next to other people, buy the land around you. If you can't afford the land around you, move further away. There's no reason your tastes should dictate what I can do with my property. I'm sorry, I thought this was America.


[flagged]


Urbanization has been a thing for a long time.


The cultural bifurcation however is increasingly pronounced. Ironically, all the more so as communications technologies improve, because those improvements are concentrated increasingly in dense metropolitan areas. Not necessarily city centres, but the urban metro areas net of suburbs around wealthy regions --- many along the coasts, but also urban areas within otherwise rural states (e.g., the blue-red division within the South and West between cities in those regions and the surrounding rural areas).

The height of homogenisation probably came about with general availability of rail-based package delivery, direct postal mail delivery, rural electrification and telephone service, major broadcast television and radio networks, and the US Interstate Highway System buildout.

At some point between the mid-1940s and mid-1970s, there wasn't a whole lot of difference between what general culture was available within a broader metro region and a rural area. Broadcast and print culture (e.g., magazines) were fairly uniform (there weren't enough networks to support hugely divergent micro-targeting), goods and communications could spread well. Yes, major cultural hubs such as New York, Los Angeles, and San Francisco had their own specific cultures, and there was information for which you'd probably have to physically visit a major university campus or library to access. But overall, cultural diversity was hitting a low point. Flyover regions might wait a few weeks or months longer for first-run films (distributed as actual film prints), or see less coverage by major performing acts, but that was largely it.

That's been walked back with increases in direct-cabled broadband communications infrastructure, probably starting with cable TV (the fact that cable TV wasn't yet available was a laugh line in Fast Times at Ridgemont High, 1982, with the book it was based on published in 1981).

Now, the distinction between broadband in the tens of Mbit (or lower) vs. 100 Mbit to Gigabit ethernet, 5G vs. 3- or 4-G mobile networks, and general reliability of network infrastructure, as well as installation and maintenance costs, is a major distinguishing feature between first-tier and tertiary markets. In a world where work is performed remotely, being able to participate in full-video conferencing or download or upload content matters.

Maybe those differences will get ironed out. But as with postal mail, intercity rail and bus service, interstate highways, telecoms, and electricity, it all but certainly won't be market mechanisms on their own which do so. All of the listed services I've just named were brought to rural regions through government programs, projects, organisations, and/or subsidies.


Indeed, but the speed of it has accelerated massively in the last two decades, particularly with "smart farming" and big agriculture investors buying up entire areas and consolidating the fields, and by the aforementioned globalization allowing the devastation of mining and heavy industry as it was now possible to shift all of that to China and other places with lax environmental regulations.


> In the 90s it was typical for a house to be on the market for 3-6 months before selling. Sellers generally received 1 offer.

I wonder if there's been a material shift in who is making offers that won't revert to 90s behavior. Specifically, investment firms specializing in single family housing, who might have more incentives to make a lot of lowball offers and not worry too much whether they win or lose a specific bid.

Obviously you have imitators like Zillow trying to edge in as well, and stepping back after taking heavy losses, so it's hard to say for sure where the new equilibrium will end up.


I worry about exactly this. For every house an investment company buys, how many did they bid on which drove the price up?


Prices are set on the margin. Losing bids have relatively little effect.


Yea, I doubt they're moving prices much, but if people are using "offers received" as a metric...


> Specifically, investment firms specializing in single family housing

I don't think I have seen good data that includes or calls out "investors" rather than "investment firms". I suspect that a lot of property ends up in the hands of individuals, be it a vacation home, rental, Airbnb, or flip.


Interest rates in the 90s started in double digits and ended just below that. Also things that weren't mortgages (like cars) still had double digit interest. In 1999 I paid 13% interest for my car.


That must be specific to your market. That's not what I'm seeing in the Denver area, speaking as someone who has been casually looking for a house for over a year.

As soon as interest rates started rising, conditions went from what you describe to, in a lot of cases, houses sitting weeks or months on the market. They sometimes go through multiple price drops before going under contract.

I'm speaking specifically of homes that are in the range of those I'm considering purchasing, so maybe it doesn't apply to the overall market. For me, though, it is now much more of a buyer's market than it was a year ago.


I observed that happening for about a 6-8 week period in Oct-Nov-Dec 2022, then the party returned like it was 2021 all over again.


January 15th the slowdown was over in my market and now it's back to red hot.


That's good to know. I'm hoping those conditions don't come back here for awhile.

Getting into spring, I could see more people start looking again, with some pent-up demand not met from last year.


That's because people are still not shedding the mentality of hoping to get top dollar. The market is going back in the buyer's direction. You need to be looking at recent comparable sales and pricing at or below surrounding current inventory if you want to move it quick. The mania did a number on people's perception of how the housing market should work. It will pick back up just as soon as the Fed officially backs off on rate increases, or even makes a slight cut, likely soon here.


I'm near you in the Boulder area and I am seeing the same thing: a huge drop in interest in houses on the market. Last year places would get under contract within a week; now houses are sitting on the market for weeks and even months. I talked to our realtor (we are planning on moving again for work) and she said average days on market has doubled from last year in the Boulder metro area.


I'm in Boulder, too, and you can't ignore the "housing stock" problem here. There are a lot of crap houses in Boulder that really need to be demolished. A late-90s house on my street sold in 1 day over asking, while an '87 house with a mid-aughts renovation has been sitting for 6 months. They were priced very differently relative to quality. I think what's happening here is not that the market has cooled overall, but is reflecting the vast annoyance that is Boulder's building department, mixed with bad housing stock.


I would say, even further, here in the rust belt there is a whole other problem:

Inventory is gone. Not low. Gone. In the 45 mile radius we are looking in for an area with a pop of ~100,000, at houses under $350k (90% of listings)...there are weeks with ZERO listings.

This has led to an insane situation where offers end up way over asking, but there are so few comps that virtually all FHA financing falls through, not because of issues with the offer but because of the appraisal issue: no house actually appraises near the price it goes for.

If you take a look at Zillow, most houses that last sold in 2019 after a renovation, with zero changes since, are listing at $330k. This has crunched the rental market as one would expect. Rental prices, in what is absolutely a low cost of living area, have multiplied overnight from ~$700 to $1500+. I know most in this area are absolutely not paid like us, so I'm pretty sure the area is about to hit a wall. One might expect a market to limit itself as homes that cost too much won't sell... but the other side of it is that I think we are getting to an inflection point of 'well, everyone needs a home'. I mean... is this a housing bubble coming? When that bubble pops, what happens to all the people with $350K loans on houses that are back down to $200K?


I think that’s the downside of raising interest rates. Obviously I’m no economist, but if rates are high and I’ve already got a home I will not move. If people don’t move inventory is low and needs to be built. High interest rates probably put a crimp on new buildout. Just my thoughts.


Homeowners sometimes die, get divorced or have to move to a different area. Number of sales will go down, but not to zero. The prices of those sold due to exigencies will drop because of inflation.


I left the Bay Area in 2020 and started working remotely because I didn't want to enter the real estate market at the top, right before a crash.

I've never believed in doing something just because everyone else thinks it's the default thing to do. Putting $1.5M to $2M into a small, average house didn't make sense to me when that is enough money to retire in many different beautiful places in the world.

So I left. Switched jobs and started working remotely. I now live in a MCOL area and things are about 30% cheaper than the Bay Area. Better schools. Nicer environment. More diversity (including economic diversity). More access to activities (like gymnastics) for my kids at a reasonable price. More free time.

I make a little less money in cash, and a lot less money in equity. But all in all it's a better life.

I hope more people open their eyes and realize these tech hotspots are actually hell.


I think that critiques of the Bay Area are overstated (probably because of cognitive dissonance). Yes, it's extremely expensive, but it's also very nice. If you can afford it I don't think there's any inherent reason to expect to like anywhere else more. Of course if you can't afford it then, yes, you should probably leave, but that's not some crazy revelation.


I found Raleigh NC area nicer than Bay Area, more greenery, easier traffic. Yes, more boring, but you can experience real redneck rodeos ...


> I hope more people open their eyes and realize these tech hotspots are actually hell.

Wouldn’t that make the other currently nice places hell again :^)


It's a good thing that there are SO many MORE nice places... See Montana/Idaho: plenty of room, plenty of broadband (even FIBER).


You bet!


I'm curious where you live that you find more diversity? I also moved out of the bay in 2020, have lived a few places since and the diversity of the bay area is pretty much the only thing I miss.


Diversity in the bay area? Isn’t it mostly non-religious liberal Democrats?


Yeah, but at least they aren't all white. Where I live now is also mostly non-religious liberal Democrats, but they all think ranch dressing is spicy.


What market?

Anecdotally, I follow many smaller markets in the Midwest and the South (plus Chicago). I wouldn't call it a buyer's market anywhere. But I'm not seeing anything like this.

I also have a few friends shopping on the East coast - and they're in a similar situation. It's not a buyer's market. But also not crazy.

My understanding was that The Bay, SoCal, and Seattle had massively slowed down. So I'm just wondering where this could be - or if things have recently turned around a lot.

Maybe my info is just bad...


The house around the corner from me in Cupertino just sold. It's 1700sq ft on a small lot, and they were asking $3M for it. It sold for $3.2M four days after the open house. It sold so quickly they cancelled the second open house. Apparently they got 20+ offers.

So at least here in Cupertino, things are still nuts.


Cupertino is a unique case, as you have extremely rich Chinese investors as well as Apple HQ in that city. Also it is known to have the best schools in the Bay Area besides Palo Alto. But yeah, totally crazy nonetheless.


Interestingly only 10% of Apple employees live in Cupertino. And we actually don't have many foreign investors, Chinese or otherwise, as most houses are owner-occupied. But you're right about the schools!


$3.2M for a measly 1700sq ft?! Nuts. You can have a bigger, probably better house near the new Apple campus in Austin -walking distance, through a beautiful park no less!- for $600k or $700k.


You pay $700k for the house and $2.5M for the weather, stable power grid, and lower taxes (yes, people in Cupertino pay lower overall taxes than people in Austin).


> people in Cupertino pay lower overall taxes than people in Austin

Only because most people who can afford to buy in Cupertino aren't paying for it with earned income, right? Google tells me a $2.5M mortgage has minimum payments of $230k a year, anyone who makes enough to afford that is probably paying at least $50k/year in CA state income tax, on top of ~$25k in property taxes. Austin has high property taxes but I don't see them coming anywhere close to that, and the sales tax is lower.


Property taxes are a lot lower in California because of Prop 13. For a new buyer, they are about the same. But after just a few years, the Texas property tax will be much higher because California property tax pretty much doesn't go up after you buy. Also energy taxes soak Texans because they use a lot more energy than in California due to the hotter summers and colder winters.


> For a new buyer, they are about the same. But after just a few years, the Texas property tax will be much higher

But do you have 1,700sqft shacks going for $3.2M in Texas ?


One might pay less in taxes in Cupertino, but if the mortgage service is 3.5x what it would be in Austin, then one is still paying a tremendous amount in what amounts to a tax of sorts. Sure, if you have a mortgage you'll be building equity, but also paying more interest than you'll ever build in equity, and you'll depend on ever-increasing property values to begin to make this a worthwhile trade-off.


How do they pay lower overall taxes?


Here is an article on it:

https://www.yahoo.com/video/everything-bigger-texas-includin...

But basically the higher property taxes and additional taxes and fees that Texans pay on consumption, like energy taxes, end up more than making up for the lack of income tax, unless you're a top 1% earner (which is somewhere around $700k/yr). Energy tax is a big one because Texans use a lot more energy since they have hotter summers and colder winters.


The question was about a Cupertino homeowner vs what that corresponding homeowner would see in Austin. You're linking an article about the median Californian vs median Texan, which -- even accepting its assumptions -- is only superficially relevant, and very unrepresentative for the comparison under discussion.

Even at the lower rate, the property taxes on the Cupertino house alone dwarf any tax difference they would pay for a comparable lifestyle in Austin [1].

To be blunt, it looks like you deliberately chose a misleading comparison to push the narrative you already settled on. That's below the standard I've come to expect of your comments.

[1] Here's a $3.2m Cupertino home, which shows $40k/year in property taxes. A comparable home in Austin for the $700k the parent mentioned would have ~$14k/year in property taxes. https://www.realtor.com/realestateandhomes-detail/21467-Krzi...

So, paying the $2.5 million ... to pay $26k/year more just in property taxes.


To which one must add interest if one has a mortgage. It's nuts. Now I'm wondering what the shelter situation is like for the bottom 10% in California vs. Texas. And if affordable shelter == long commute, one has to value that time, and the mileage and gas or public transportation costs, etc.


"About the only Texans who fare better than Californians are the wealthy. The top 1% of earners in Texas — those making $617,900 or more annually — only pay 3.1% of their income in state and local taxes. That compares to a rate of 12.4% for top earners in California, who make $714,400 or more per year."

So if you're a top 1% or more, you save at least 4X in taxes. I'm sure you come out on top in TX financially for the vast majority of people reading this site, $100k+ earners. Everyone I know that moved to Austin from CA did so to avoid paying taxes on cashing out stock options.


> Everyone I know that moved to Austin from CA did so to avoid paying taxes on cashing out stock options.

And a lot of those people are now moving back after that one time tax savings. Heck even Elon is moving back.

> So if you're a top 1% or more, you save at least 4X in taxes

Only income taxes, but property and energy taxes are still higher.


If you don't consider higher cost of shelter a sort of tax...

The cost of shelter for an Apple employee in Cupertino is what, 3.5x the cost of shelter for an Apple employee in Austin? Nuts.


I think the people making these choices are in another universe from most of us working folk, so trying to understand this behavior from one data point isn't going to make sense. This could even be a second home bought for convenience. It can be hard to wrap your head around that much wealth if you're in a more down-to-earth class.


I haven't met the new owners yet, but most of my other newer neighbors are not what you'd consider wealthy. It's mostly families with two working adults where one works or worked at FAANG, and sometimes both. Most people's story starts with "my FAANG stock made my down payment, and the salary pays the mortgage".

The older neighbors are generally people who have owned for 30+ years and bought when things were still reasonable. The houses in our neighborhood are all about 60 years old, and we even still have some original owners!

But the point is for the most part people here are not so wealthy they can stop working, they just have high paying local jobs to pay for their really expensive house and don't have a lot left over after housing expenses.


Spending 3.2 million on a home is in another league of wealth from the vast majority of people.


They probably could stop working, they just wouldn't be able to afford a house in Cupertino.


Yeah but the weather....


I love the weather in Austin. Not very cold in the winter, and I love the dry heat of the summer.


San Francisco condos are down a lot, many going for less than $1000/sqft now. But that's also the political climate and the increase in crime that is contributing to all that. I might come in when condo prices fall to about $600/sqft.


I heard a similar story about Bellevue and Sammamish, WA. North Seattle-area (Edmonds, Lynnwood, Mill Creek, etc) have seen a price decline.


Many of the Northern VA counties and certain others in the DC metro area.


Things cooled somewhat in the fall but have definitely rebounded. On Friday the 10th I placed my townhouse in Loudoun on the market. Chose from 17 offers on Sunday evening. Accepted an all-cash offer with a quick closing for well above asking. Moving on Saturday.


I've been looking around the suburbs of Philadelphia, it's been a frenzy since summer of 2020. I guess it's slightly better in the sense that houses stay on the market for a few days instead of a few hours, but it's still absurd. My uncle, who is also in the area, put his home up for sale about a month ago and got a multimillion dollar offer same day without anyone even looking at the place. Even though prices have tripled and mortgage rates have doubled, a mortgage is still cheaper than rent.


Entirely anecdotal, but a house near me in the Peninsula part of the Bay Area was bought for $1 million (3/2) 8 months ago, got renovated, was listed for $1.698M a month ago, and went for $1.755M.


I have noticed the same trend where I live and among my close group of acquaintances and friends. Some couples are buying houses way north of $600K even while still having a mortgage on a perfectly nice home a few blocks away. My wife and I look at each other completely puzzled. Like, what is going on? Are people making this much money? Or are they getting into terrible deals due to FOMO? We could certainly afford a new home now, but no way in hell are we jumping into this market; our money goes to buy discounted stock or sits in CDs.


It could be an inflation hedge. You buy a house at a fixed rate, you pay your $5K/mo mortgage for 30 years. Meanwhile inflation makes this $5K less and less in real value. But you can rent it out for more and more.


In other words, shorting the dollar. It's the same reason you buy just about any asset that's not bonds or cash, AKA real assets. People know it's only a matter of time before the feds default on the value of the dollar.


No, they just slowly devalue it. Though not so slowly in the last couple of years.

Default is different, it's an inability to pay debts. They have no problem printing more dollars and paying debts.


I mean that by having inflation run hot at 5-10% for many years or decades, they're defaulting on their obligation to provide a stable-value currency for the people.


There's no such obligation. USD purchasing power has been dropping pretty steadily.


With savings & investments so volatile people are taking riskier bets in order to diversify


Also FOMO.

Stress makes people have a harder time filtering and looking at the long term.


> I just got an update from my realtor that, compared to 2021, houses are now getting 5-10 offers instead of 10-20.

Well, Realtors are incentivized to portray the market hotter than it actually is.


They are portraying the market exactly as it is. Reality is speaking for itself, not the realtors.


OP was talking about realtors. Realtors' commission is based on the final price, so why not portray the market as hotter?


Because if you make the seller think the market is hotter than it is, then they price the house too high, and the house doesn't sell. That results in $0 commission.

I could see them telling buyers that the market is hotter than it is. But even that might backfire, if they make qualified buyers think that they can't afford to buy in the current market.


Okay, then presumably you can just appeal to the real basis for that claim, rather than the unreliable realtor source?


I've been to the open houses and I've been watching Redfin and seen the houses come open on Thursday and pending on Monday.


Which areas? Austin this is not the case anymore and I thought this was one of the “hotter” areas in the US.


Pacific Northwest


What is the average selling price? Median for my county is ~$325k and our experience matched yours exactly up through and including the $800s. You really had to be looking at stuff 3-3.5x the median price to have a reasonable experience where you could do an inspection or expect them to replace the 19 year old furnace (or at least credit most of a new unit).


Hmm, wait a couple of months and see how the value of these homes falls!!


We hear this ALLLLL the time. I've heard it for nearly half a decade as I waited and waited and waited forever to buy a house, and yet through all of the disasters, house prices just kept going up and up and up.

The assumption is, since almost no one can afford a house, prices MUST fall. Here's the thing no one gets: the "almost" part is very important. If, in a city of 100,000 people, only the richest 100 (less than 1% of the population) can afford a house but there are only 10 houses on the market, then prices can still continue to skyrocket.

Overall demand does not need to be super high for prices to go up. It's Demand relative to supply that matters. And in markets where supply is really really tiny then demand doesn't need to be very high in order to outstrip supply.


I've felt kind of similarly. My wife and I are young--graduated college in 2018--and I feel like I've been a bit lied to about what constitutes "good financial decisions" over the past couple of years. We've been squirreling away cash to have an emergency fund long enough to support us being out of work for 6 months, paying down her student loans, trying to wait on buying a car, etc--but our friends have been buying houses, cars, etc, and it seems like that's been the more prudent decision time and time again.

We delayed buying a house to have a bit more in savings? House prices skyrocket, our friends who leveraged the crap out of themselves look genius.

We wait to buy a car to have a bit more in savings / wait for the used market to come down? The used market goes up, our friends who bought new cars look like geniuses.

There's part of me that keeps waiting for a correction, esp. in the housing market, both so that houses come back into our budget range, but also maybe because there's part of me that feels vindictive about the fact that everyone who, to my sensibilities, seems to be acting recklessly are making out better than we are.

Some of that is reasonable, I suspect: we're very financially conservative relative to our peer group, which means we're going to miss out on some opportunities, and I don't want to be "proven right" in saving for a rainy day by everyone else having economic hardship, but I do feel rather confused about how we're "supposed" to behave in this market.

We're also remarkably well off, as is our peer group, which will obviously skew the data radically, but it also scares the crap out of me: if these are the thoughts we're having with a household income just barely under $200k this young, what is everyone else thinking?


I know someone from FSU (former Soviet Union). Their grandparents saved for decades, literally keeping their cash under their mattress. The grandmother wanted to buy a car but the grandfather said it’s better to save. One day they woke up in December of 1991 and that money was almost worthless. “We should have bought a car”, that grandmother never let her husband live it down.

Financial decisions are not so black and white. IMO only basic rules apply: live below your means, save more than you spend, avoid debt, etc. Any advice beyond that is a crapshoot.


There are two kinds of spending: assets and liabilities. Assets make you richer, liabilities make you poorer. A car is the latter.

It's okay to save every penny but you can't keep it in cash. You've got to buy some real asset: either real estate, gold, equities or bitcoin.


It is VERY typical for volume to drop, while prices peak in high demand areas before volume drops to zero.


Banks push this to the limit by making it comparatively expensive for individual buyers to build, preferring to offer loans for already overpriced homes.

Edit: At least this has been my experience. Have people found otherwise themselves?


How dare you think critically about economics!


The funny thing is, people were saying that in 2021... and then... interest rates doubled/tripled crippling affordability, people's net worth plummeted due to decrease in stock prices, and more people got laid off... And everything keeps on trucking straight through it. I visited my realtor when she was showing a house for sale, and it was astounding the traffic and people coming up whispering to her how they wanted to make an offer that moment. And then on top of that the offers that came in with escalation clauses, waiving of all contingencies, etc. And this is March 2023.


I’m in a suburb of LA and while prices are still up quite a bit from mid 2020, houses in my immediate neighborhood are selling for hundreds of thousands less than they did at peak.


In hot markets with thin inventory there are still enough affluent buyers who can afford to pay cash using gains from other investments or family wealth. Prices are set at the margins so it only takes a few such buyers in each neighborhood to keep market prices high. Ultimately people need somewhere to live, and if they have to be in a particular area due to work or family obligations then they'll pay whatever it costs regardless of the underlying value.


Solving for the equilibrium though, if the prices fall, more buyers will be available to compete, so it ends up right where it was before.


With interest rates rising, the volume of buyers is dropping. In high demand areas, the folks who got lucky with a windfall or whatever, can still buy.

But the normal buyers will get more and more priced out, and the affluent area will end up with less and less volume as sellers start waiting it out because the volume of available buyers gets thinner and selling gets riskier.


People will not sell at a loss because they locked in 2% mortgages for 30 years, so no.


The same phenomenon happened in 2008. I think it boils down to the question of whether or not sellers are motivated at scale.

In 08, the poor labor market meant that many people had to sell. That caused prices to drop.

We aren't at that point yet this time, and I don't think anyone knows whether or not we are headed for a seriously rocky job market.


Yeah, but a lot of those '08 mortgages had much higher rates, and even variable rates in the subprime division. The labor market is holding strong right now, so even if it happens it's gonna take much longer than a few months.


They were giving mortgages to people they knew couldn't afford them in 08. I have not seen any evidence of that here.


I think the implication here is if you lose your job and can’t find it for a while you suddenly can’t afford your mortgage either


The job market is still historically EXTREMELY strong. We would need unemployment to hit over 5-6% before mortgages became an issue.


If you mean dentists, lawyers, and doctors. Yeah, they were giving too many prime borrowers with great credit scores way more debt than they could sufficiently manage.


I don't think 2008 was nearly the same in terms of the fixed-rate mortgage positioning. A seller in 2008 could likely move and get a mortgage at practically the same rate as the one they were leaving. That meant that trading down in housing cost would improve their cashflow.

Sellers now with a 2.75% fixed-30 mortgage could trade down a significant amount in housing cost and end up with worse cashflow if they take on a new mortgage at 5+%.


> 2017 home prices are unsustainable!

> 2019 it has to crash sometime

> 2022 ok now is the time for home prices to fall

> 2027 it must crash!

The reality is that home prices are buoyed by a number of systemic factors that cause supply to be constrained. Prices will go down in total value due to interest rate rises. Homes will not be more affordable.


> Prices will go down in total value due to interest rate rises.

As somebody with the cash, I'd settle for even seeing this much. So far it's been high interest rates with the same high prices (in much of the Northeast).


Pressure is on supply, demand relatively strong as ever.

Supply: low and expensive new builds, heavy rate lockin.

Demand: muted due to higher rates, but sometimes people gotta move. So a slight haircut, but in many areas most likely less haircut than supply constraint.

We are seeing some shifting in the market but I personally believe sales volume is driven by supply lockin, meaning it's still a seller's market.


I'm not saying it won't happen but people have been saying exactly this for a decade.


That doesn't make sense... interest rates have been low for a decade. Why would falling prices be predicted?


There's a general sense that housing is not affordable for the middle class who does not already own a home. You predict falling prices with the thought that it's not sustainable and some changes in public policy will occur, as we will not allow for middle class people to be unable to afford a home.


It is sustainable, though, if you don’t assume homeownership rates have to be constant-or-rising.


Supply remains constrained, now with people not wanting to take on higher interest rates.

Move rates are roughly 9-10% a year. I bet we see that a lot lower for this quarter and next few quarters.


I think this is generally correct. This is the situation in my area: Prices can never fall, because if the situation was such that prices would fall, no one sells. So there's basically little to nothing available, so buyers are stuck competing no matter the macroeconomic situation.


Even in ‘08 this was the case.

That’s why short sales, foreclosures, ‘mailing the keys in’, etc. were a thing.

With housing, it’s NEVER a thing that someone just sells at a loss just because.

They sell at a loss because they don’t have a choice, and the bigger the loss, usually because the less choice they have.

If labor market is still doing well, then there is little pressure. Usually that happens a bit later anyway - construction workers out of work because housing isn’t being built, or folks employed by tech workers get laid off because tech isn’t so sure about those bonuses, etc.

We’ll see though - maybe the fed will pull off a soft landing this time.


I just mean in a very general sense "just wait until the housing market collapses!" is a very common refrain, absent any evidence to prompt it.


What are they going to wait for? The rate of increase to decline 5 percent?


I bought into a new community northwest of Houston, closed last summer. It's about 75% built out, and the new houses are sitting available for some time.


Anecdotal, but the same is happening in the mid-Willamette Valley, Oregon. We've recently seen houses listed and sell next day for over asking. How those prices line up with historic averages, I can't say, but after a maybe 5-month cooling period, I can say things have definitely picked up in our neighborhood. We have two close realtor friends and they confirm the same.


There's a house on my street in Palo Alto that's been sitting on the market for some 3-4 months.

Previous owner died, someone bought it and cleaned it up to flip it. It actually looks very nice.

No one is buying it though.


A) Tune out anything realtors tell you if you want to be an intelligent consumer. B) Volume has collapsed because people in homes with low-rate mortgages can't move. These prices are "less solid" because liquidity is poor.

In Austin prices are definitely down ~8%; after this hike and sustained layoffs I am betting it will shed another 8% by the end of summer...


All of that to buy something like a 2 bed, 1 bath SFH of 735 sqft built 76 years ago for $1M (https://www.realtor.com/realestateandhomes-detail/1761-Geary...).


Higher interest rates lower demand, but they also lower supply.

In our area, age on the market is increasing and I think the average for some areas is over 60 days, which is an indicator for a buyers' market.


Inventory is still below demand in most places. People are just freaking out because the mania has subsided. I'd call this current market about normal.


What state, region, market are you in?


That’s because it’s worse, but not really bad yet.


I assume you are in the Seattle metro area?


I follow the market for the Seattle East side pretty closely and prices have definitely dropped compared to a year ago. A ton of houses that were listed with ridiculous early-2022 prices have been re-listed multiple times at lower prices, in many cases dropping the price by well over 6 figures. One example: a house in my neighborhood initially listed for 1.8m ended up selling for 1.45m.


No, which goes to say the price-escalations and white-hot housing markets are still a thing across North America, straight through the tech crash, bond crash, layoffs, and now banking crisis.


There are obviously huge distortions in the housing market.

- how many buyers are investors from private equity or foreign investors vs residents (who intend to live in the property)?

- What is the true population including undocumented people? This will be off by 30-50% in some areas.

- What is the true vacancy rate? In my area there are many houses sitting vacant. I’m assuming they are investments or tax dodges

It baffles me at how critical the housing market is and that no one takes responsibility to understand what is affecting prices. We just accept prices as a force of nature.


We have to remember that Rob Henderson is a performer running his shtick. At first I really liked him, and in general agree with his assessments of things.

But he's just another guy that has to turn tricks with some new variant of "durr hurr liberal plan for X is elitist and a luxury belief."

It's a little personal with him, because right before the 2020 election he posted something pretentious like "In 2016 I bet and won a lot of money on the outcome of the election, and I bet even more this time" (implying his non-elitist background gave him the clairvoyance that Trump would win). I asked him who he bet on this time. Then the results came in and Trump was clearly losing in 2020, and then he blocked me.

In the end, even if we agree with him, he's just another culture warrior whose livelihood depends on feeding the angry masses outrage at whatever Team Blue/Team Red is doing.


It's just unfortunately becoming clear Python is going the way of C++.


I'm in a job now that mostly involves bare Linux network programming in C. The only dependencies are CMake, VS Code with C/C++ plugin, and the regular Linux libraries. No "sprints" or "Agile" or "tickets" or cruft like that - just delivering incremental value to customers at a reliable pace. Gosh I love it and it brings me straight back to my happy place. I once worked a job adjacent to modern web development and front-end stuff and man, that stuff just sucks and is not fun.


I have a question I've always wanted to ask but have been too embarrassed to (especially because I've used C extensively for well over a decade now and am intimately familiar with it):

Who exactly are these new C-standards for?

I interact and use C on an almost daily basis. But almost always ANSI C (and sometimes C99). This is because every platform, architecture, etc has at least an ANSI C compiler in common so it serves as the least common-denominator to make platform-independent code. As such it also serves as a good target for DSLs as a sort of portable-assembly. But when you don't need that, what's the motivation to use C then? If your team is up-to-date enough to quickly adopt C23, then why not just use Rust or (heaven forbid, C++23)?

I'd love to hear from someone who does actively use "modern" C. I would love to be a "modern C" developer - I just don't and can't see its purpose.


> Who exactly are these new C-standards for?

An example: The C11 memory model + <stdatomic.h> + many compilers supporting C11 has/had a positive impact on language runtimes. Portable CAS!
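
For those who haven't used it, here's a minimal sketch of what that portable CAS looks like in practice (my illustration, not from the parent; it assumes a C11 compiler that doesn't define __STDC_NO_ATOMICS__, and the names are made up):

  #include <stdatomic.h>
  #include <stdbool.h>
  #include <stdint.h>

  static _Atomic uint64_t counter;

  /* Increment the counter only while it is below a limit, via a CAS retry loop. */
  bool bump_if_below(uint64_t limit) {
      uint64_t old = atomic_load_explicit(&counter, memory_order_relaxed);
      do {
          if (old >= limit) return false;
          /* on failure, 'old' is refreshed with the current value and we retry */
      } while (!atomic_compare_exchange_weak_explicit(
                   &counter, &old, old + 1,
                   memory_order_acq_rel, memory_order_relaxed));
      return true;
  }

The same code compiles unchanged on GCC, Clang, MSVC, etc., which is exactly the runtime-implementer win: no per-compiler intrinsics needed.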

> If your team is up-to-date enough to quickly adopt C23, then why not just use Rust or (heaven forbid, C++23)?

Another example: If you're programming e.g. non-internet-connected atomic clocks with weather sensors like those produced by La Crosse, then there's no real security model to define, so retraining an entire team to use Rust wouldn't make much sense. (And, yes, I know that Rust brings with it more than just memory safety, but the semantic overhead comes at a cost.)

Another example: Writing the firmware to drive an ADC and broker communication with an OS driver.

Another example: The next Furby!


Atomics are one of the things in C11 I like most. Unfortunately, they are an optional feature, along with threading [1], so they are not portable. In the end, I am still using gcc/clang's __sync or __atomic builtins.

[1] https://en.wikipedia.org/wiki/C11_(C_standard_revision)#Opti...


<stdatomic.h> is provided by GCC (not the libc), so I expect it to be available everywhere the atomic builtins are supported.

I prefer the builtins. With _Atomic you can easily get seq-cst behavior by mistake, and the <stdatomic.h> interfaces are strictly speaking only valid for _Atomic types.
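
To make the difference concrete (a small sketch of my own, not exhaustive): plain accesses to an _Atomic object are sequentially consistent unless you remember the _explicit variants, while the GCC/Clang builtins work on ordinary objects and force you to name the ordering every time:

  #include <stdatomic.h>
  #include <stdint.h>

  _Atomic uint32_t ready;   /* plain loads/stores are implicitly seq_cst */
  uint32_t ready2;          /* ordinary object, usable with the builtins */

  void publish_atomic(void)    { ready = 1; }   /* seq_cst store, easy to get by accident */
  void publish_builtin(void)   { __atomic_store_n(&ready2, 1, __ATOMIC_RELEASE); }
  uint32_t consume_builtin(void) { return __atomic_load_n(&ready2, __ATOMIC_ACQUIRE); }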


> Unfortunately, it is an optional feature along with threading [1]. It is not portable.

Portability isn't binary. It's the result of work being done behind-the-scenes to provide support for a common construct on a variety of hardware and operating systems. It's a spectrum.

GCC certainly is portable, but it doesn't support every ISA and OS. Over time it has even dropped support for several.

Random thoughts since I'm still in the process of waking up…

1. Most of <stdint.h> is optional

2. long is 64 bits on Tru64 Unix which is valid under all versions of the standard


Requiring C11 is itself a portability issue.

I can easily whip up CAS for all platforms I care about, plus a few more for bonus, using nothing but C99 or C90 (and have actually done it before).

Possibly, without even using any language extensions, unless I want the operations inlined.


Did you manage to write a truly portable compare-and-swap in standard C? Or did the code just happen to seem to work on your platform?

I'd be surprised if the former were possible. From a quick web search, C doesn't even give you the guarantees necessary for Peterson's Algorithm, [0][1] and volatile doesn't help. [2][3]

[0] https://codereview.stackexchange.com/a/124683

[1] https://stackoverflow.com/questions/35527557/errors-with-pet...

[2] https://web.archive.org/web/20160525000152/https://software....

[3] https://old.reddit.com/r/programming/comments/d457c/volatile...


I think the point by kazinator still stands. C11 is still a portability issue (which was the root of my original question above). ANSI C is the de-facto baseline everything has in common, using C11+ narrows which platforms you're able to build for. Even if C11 does provide the primitives for correct implementation of Peterson's Algorithm, what about the other platforms that don't have a C11 compiler - it does them no good. As to the point more directly, a lot of the C code I've seen is for real time and embedded systems that are usually time and memory partitioned, and do not have the same concerns regarding concurrency.

I guess if I could rephrase my original question: People who are going to adopt C23 - who are you and what field/industry/line-of-work are you in?

Asking because in my line-of-work, C is ubiquitous and I personally love coding in C, but anything beyond ANSI C (or C99) is "cool" but undermines the point of C as I've used it, which is its use cross-platform for a huge set of common and uncommon architectures and instruction sets. If something only needs to run on common, conventional platforms, C, however much I love it, would no longer necessarily be a strong contender in light of many alternatives. It seems like these standards target an ever shrinking audience (much smaller than the whole universe of software developers working in C).


  > ANSI C is the de-facto baseline everything has in common,
  > using C11+ narrows which platforms you're able to build for.

Sure, but which platforms are you thinking of? I think you're overestimating the proportion of C codebases that care about targeting highly exotic hardware platforms. GCC can target all sorts of ISAs, including embedded platforms. It can't target, say, the 6502, or PIC, or Z80, but they're small niches.

I'm not an embedded software developer though, perhaps there are more developers using niche C compilers than I realise.

  > Even if C11 does provide the primitives for correct implementation
  > of Peterson's Algorithm, what about the other platforms that don't
  > have a C11 compiler - it does them no good.

If portable atomics/synchronisation machinery is not offered by your C compiler or its libraries, I figure your options are:

1. Use platform-specific atomics/synchronisation functionality

2. Leverage compiler-specific guarantees to write platform-specific atomics/synchronisation functionality, if your compiler offers such guarantees

3. Write your atomics/synchronisation functionality in assembly, and call it from C

Here's a project that uses all 3 approaches. [0] (It's old-school C++ rather than C, but it's the same in principle.)

I'm fairly sure it's not possible to implement your own portable synchronisation primitives in standard-compliant C90 code. As I understand it, the C90 standard has nothing at all to say on concurrency. It's possible that such an attempt might happen to work, on some given platform, but it would be 'correct by coincidence', rather than by definition. (Again, unless the particular compiler guaranteed the necessary properties.)

[0] https://github.com/gogglesguy/fox/blob/fe99324/lib/FXAtomic....


Neither. The code wasn't standard C, and it didn't just "happen" to work, either.

You probably wouldn't want to build an atomic compare-exchange in standard C, even if it were possible; you find out what the hardware provides and work with that.


  > The code wasn't standard C, and it didn't just "happen" to work

Thanks, that makes sense.

At the risk of sounding pedantic, you did say using nothing but C99 or C90, implying use of standard features only.

  > You probably wouldn't want to build an atomic compare-exchange in
  > standard C, even if it were possible; you find out what the
  > hardware provides and work with that.

Agreed.


Right; I could do it using nothing but standard C features in C source files, by defining some C compatible assembly language routines. The compare_swap primitive can be an external function, e.g.:

  bool cmpswap64(uint64_t *location, uint64_t if_old, uint64_t then_new);

Code that relies on the header doesn't have to process anything compiler-specific. FFI could be used to bind to that function from non-C languages.
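
And a caller-side sketch of how that might be used (hypothetical names; the assembly implementation lives in a separate per-platform file):

  #include <stdbool.h>
  #include <stdint.h>

  /* Implemented per-platform in assembly; returns true if the swap happened. */
  bool cmpswap64(uint64_t *location, uint64_t if_old, uint64_t then_new);

  /* Retry loop: re-read, compute, and CAS until no other thread interferes.
     The plain read is only a starting guess; the CAS decides correctness. */
  uint64_t counter_add(uint64_t *counter, uint64_t delta) {
      uint64_t old, new_val;
      do {
          old = *counter;
          new_val = old + delta;
      } while (!cmpswap64(counter, old, new_val));
      return new_val;
  }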


Interesting discussion, lots of good points all around. Thanks to you both.


In my case: because writing C code (specifically C99 or later - designated init and compound literals!) gives me joy in a way that neither C++ nor Rust provide (C++ was my go-to language for nearly two decades, between ca. 1998 and 2017), and I tinkered with Rust a couple of years ago, enough that I realized that I don't much enjoy it.
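
To illustrate (a tiny sketch of my own, illustrative names), those two C99 features look like this:

  typedef struct { int x, y, w, h; } rect_t;

  void draw_rect(const rect_t* r);

  void frame(void) {
      rect_t view = { .w = 640, .h = 480 };       /* designated initializers; x, y default to 0 */
      draw_rect(&view);
      draw_rect(&(rect_t){ .x = 16, .y = 16, .w = 32, .h = 32 });   /* compound literal, no named temporary */
  }

Neither has a direct equivalent in C89, which is a big part of why C99-style code feels so different to write.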

IMHO, both C++ and Rust feel too much like puzzle solving ("how do I solve this problem in *C++*?" or "how do I solve this problem in *Rust*?"); when writing C code, the programming language disappears and it simply becomes "how do I solve this problem?".

PS: I agree that the C standard isn't all that relevant in practice though, you still need to build and test your code across the relevant compilers.


Maybe we just work on different kinds of software, but I feel like I'm actually solving problems in Rust when I'm using it. I don't have to think about all the terrible string manipulation APIs and how they can come back and bite me; who owns what is something I still have to decide, except that the compiler actually helps out; and I have access to nice APIs that solve ancillary problems for me already (e.g., rayon, serde, etc.). I can't wait for the day when no new parser is ever written in C again.

In C, I feel like I'm building a house out of tinker toys, C++ is Lego Technic, and in Rust I'm using bricks and mortar. FWIW, Python feels like water balloons and drywall to me; while it might look OK from the outside, one thing pierces your exterior and things tend to sag sadly from there.


> terrible string manipulation APIs

Don't use (most of) the C stdlib; it's useless and hopelessly antiquated (not just for string manipulation). Use 3rd-party libs instead.

> who owns what is something I still have to decide except that the compiler actually helps out

Lifetime management for dynamically allocated memory in C should also be wrapped in libraries and not left to the library user. In general, well designed C library APIs replace "high level abstractions" in other languages.

But I agree, the memory safety aspect of Rust is great, and I'd love a small language similar to C, but with a Rust-like borrow checker (but without all the stdlib bells'n'whistles that come with it, like Box<>, RefCell<>, Rc<>, Arc<>, etc etc etc...) - instead such a language should try to discourage the user from complex dynamic memory management scenarios in the first place.

It's not the memory safety in Rust that turns me off, but the rest of the 'kitchen sink' (for instance the heavy functional-programming influence).


> use 3rd party libs

And now I have more problems ;) . (And I develop on CMake; consistently using external deps is a nightmare.)

The thing is that I usually am that library author (or working in an area that acts like that).

I'm not sure what you expect to be left if you say "the stack is all you can use" (which is what I understand to be remaining when you remove those "bells'n'whistles").

I also really enjoy the functional aspects. I don't want to think about what some of the iterator-based code I've done looks like in C or C++ (even with ranges).


> "the stack is all you can use" (which is what I understand to be remaining when you remove those "bells'n'whistles")

Not what I meant; heap allocations are allowed (although the stack should be preferred if possible), but ideally only with long lifetimes and stable locations (e.g. pre-allocated at program startup, and alive until the program shuts down), and you need a robust solution for spatial and temporal memory safety, like generation-counted index handles (https://floooh.github.io/2018/06/17/handles-vs-pointers.html).
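
A very condensed sketch of the handle idea (my own simplification, not the exact scheme from that post, which packs index and generation into one integer and manages free slots more cleverly):

  #include <stddef.h>
  #include <stdint.h>
  #include <stdbool.h>

  #define MAX_ITEMS 1024

  typedef struct { uint32_t index, generation; } handle_t;
  typedef struct { float x, y; } item_t;

  static item_t   items[MAX_ITEMS];
  static uint32_t generations[MAX_ITEMS];   /* bumped on every free */
  static bool     in_use[MAX_ITEMS];

  handle_t item_alloc(void) {
      for (uint32_t i = 0; i < MAX_ITEMS; i++) {
          if (!in_use[i]) {
              in_use[i] = true;
              return (handle_t){ .index = i, .generation = generations[i] };
          }
      }
      return (handle_t){ .index = MAX_ITEMS, .generation = 0 };   /* "invalid" handle */
  }

  /* Returns NULL for stale or invalid handles instead of handing out a dangling pointer. */
  item_t *item_lookup(handle_t h) {
      if (h.index >= MAX_ITEMS || !in_use[h.index]) return NULL;
      if (generations[h.index] != h.generation) return NULL;
      return &items[h.index];
  }

  void item_free(handle_t h) {
      if (item_lookup(h)) {
          in_use[h.index] = false;
          generations[h.index]++;   /* invalidates all outstanding handles for this slot */
      }
  }

The point is that use-after-free turns into a failed lookup you can check for, instead of a pointer into reused memory.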


Yeah, I feel like we work in completely different realms of programming. Which is fine; there are plenty of languages out there for all kinds of use cases. FWIW, pre-allocation absolutely doesn't work when you support loading GB-size datasets. You certainly can't leave it sitting around forever either once it is loaded.


What is your favorite 3rd party string library? I'm aware of several, but haven't used any of them in anger.


"IMHO, both C++ and Rust feel too much like puzzle solving ("how do I solve this problem in C++" or "how do I solve this problem in Rust?"), when writing C code, the programming language disappears and it simply becomes "how do I solve this problem?")."

This statement very much resonates with me. It's honestly one of the things I like about C. Although it's not perfectly like this for me all the time. For example string manipulation is not great.

Another aspect I like about C is that there is not a plethora of ways of doing the same thing, which I have found always makes it more readable than Rust and C++.


In contrast, when I write C, I spend far too much time thinking "how do I solve this problem without causing undefined behavior?"


That's what UBSAN is for (along with ASAN, TSAN, static analyzers, and compiler warnings). Just dial everything up to eleven and you can offload a lot of that thinking to the compiler - it's not the 1990s anymore ;)
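To make that concrete, a minimal sketch (the file name and exact flag spelling are just one common GCC/Clang invocation) of a bug the sanitizers report at runtime so you don't have to reason it out yourself:

  /* Build with: cc -std=c11 -Wall -Wextra -g -fsanitize=address,undefined demo.c */
  #include <limits.h>
  #include <stdio.h>

  int main(void) {
      int x = INT_MAX;
      x += 1;                /* signed overflow: undefined behavior; UBSAN
                                prints "signed integer overflow" here (and
                                aborts with -fno-sanitize-recover) */
      printf("%d\n", x);
      return 0;
  }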


UB sanitizers can only show that your code has undefined behavior, not that it does not. And the results are only as good as your tests. Those sanitizers are also not available with old embedded toolchains.

I do dial up the warnings to 11, yet it is not enough.

I've written C code that's currently running on hardware orbiting the earth. I'll never do it again if I can help it; it wasn't worth the stress. You only get one chance to get it right.


> I've written C code that's currently running on hardware orbiting the earth.

I guess in such a situation I would not trust any compiler (for any programming language), no matter how 'safe' it claims to be, but instead carefully audit and test every single assembly instruction in the compiler output ;)


I'm in WG14, and I, like you, only use C89. So why does C23 matter? Well, in terms of features it matters very little, but a big part of WG14's work is clarifying omissions from previous standards. So when C23 specifies something that has been unclear for 30+ years, compiler developers backport the clarification into older versions of C where it was simply unclear. It matters a lot for things like the memory model.


> compiler developers back port it in to older versions of C where it was simply unclear

You cannot rely on that. If you're maintaining C90 code, with a C90 compiler or compilation mode, you should go by what is written in ISO 9899:1990, plus whatever the platform itself documents.

We actually don't want compiler writers mucking with the support for older dialects to try to modernize it. It's a backward-compatibility feature; if you muck with backward compatibility features, you risk breaking ... backward compatibility!


C cares a hell of a lot about backwards compatibility. Whenever there is a corner case that gets fixed, the number one goal is to retain compatibility. Most of the time, these clarifications codify what everyone is already doing and has been doing for decades.

Also, most of these corner cases are so obscure that the vast majority of people with decades of C experience have not encountered them. C is an extremely explored space.


At least for C++ there is something called defect reports. When agreed upon, those defect reports are retroactively applied to previously published C++ standards.

As a random example for something as fundamental as classes in C++, the page https://en.cppreference.com/w/cpp/language/classes shows ten defect reports.


That's very interesting. Thank you.


Existing C code needs to be maintained, and can take advantage of the newer features when available in the compiler. The Linux kernel is moving to C11, and may move to C17/C23 later. Also not everyone wants to put up with the compilation times, object sizes, and aesthetics of Rust.

As for new developments, see for example https://news.ycombinator.com/item?id=33675462 which uses C11.


I doubt that the kernel will adopt the C++ memory model (the big change in C11). Instead, they will keep doing their own thing. Given the problems with the memory model, I can't really fault them. But framing this in terms of standards versions is a bit of a stretch. They could easily adopt additional GCC extensions over time as they move minimum compiler versions forward. Standardization does not really matter there.


> Who exactly are these new C-standards for?

For me and for many colleagues in my lab? C is quite big in scientific computing and signal processing. Fortran would be slightly better, and it is widely used, but not directly around me. The C99 standard, which added complex numbers and variable length arrays, was truly a godsend in the field. I cannot imagine working without it.
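As a small sketch of why those two C99 additions matter day to day in numerical code (the names here are invented, not from any particular codebase):

  #include <complex.h>
  #include <stdio.h>

  /* n is a runtime value; the parameters are C99 variable length arrays */
  static double complex dot(int n, const double complex v[n],
                            const double complex w[n]) {
      double complex acc = 0.0;
      for (int i = 0; i < n; i++)
          acc += v[i] * conj(w[i]);
      return acc;
  }

  int main(void) {
      int n = 3;
      double complex a[n];             /* VLA on the stack */
      for (int i = 0; i < n; i++)
          a[i] = i - i * I;            /* I comes from <complex.h> */
      double complex d = dot(n, a, a);
      printf("%g%+gi\n", creal(d), cimag(d));
      return 0;
  }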

If you write a numerical algorithm that needs to be run 15 years from now, then C and Fortran are possibly the sanest choices. If you do something in other, fancier, languages, you can be sure that your code will stop working in a few years.

The new C standards are really minor changes to the language, and they happen in the span of a decade. It is quite easy to be up to date. And in the rare case that your old code stops compiling, the previous (less than a handful) versions of the language are all readily available as compiler options in all compilers. You can be reasonably sure that a C program written today will still compile and run in 20 years. You can be 100% sure that a python+numpy program won't. If you care about this (for example, if you are writing a new linear algebra algorithm to factor matrices), then choosing C is a rational, natural choice.


> You can be 100% sure that a python+numpy program won't.

It's possible to use a python+numpy program in 20 years, but you also have to save the entire environment and make sure that it works air-gapped (otherwise external dependencies would fail). One possibility would be to store it as a qemu virtual machine. It's very possible today to boot up VMs of systems that are 20 years old and older (e.g. 20-year-old Linux distros or a Windows XP ISO from the early 2000s).


If your team is up-to-date enough to quickly adopt C23, then why not just use Rust

There are a lot of reasons to use C23 over Rust:

- multiple compiler implementations

- works on more platforms

- defined standard

- ability to create self-referential data structures without hacky workarounds (see the sketch below)

- immediate, easy access to large numbers of C libraries

(For the record I like Rust, but the evangelism over the past half decade has been pretty ridiculous. Consider this counter-propaganda.)
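On the self-referential point, a minimal sketch of the sort of structure that is ordinary C but typically needs Rc/RefCell, arenas, or unsafe in Rust (names invented for illustration):

  #include <stdio.h>

  struct node {
      int value;
      struct node *prev;   /* pointers back into the same kind of object */
      struct node *next;
  };

  /* circular doubly-linked list threaded through the nodes themselves */
  static void link_after(struct node *pos, struct node *n) {
      n->prev = pos;
      n->next = pos->next;
      pos->next->prev = n;
      pos->next = n;
  }

  int main(void) {
      struct node head = { 0, &head, &head };   /* refers to itself */
      struct node a = { 1, NULL, NULL }, b = { 2, NULL, NULL };
      link_after(&head, &a);
      link_after(&a, &b);
      for (struct node *it = head.next; it != &head; it = it->next)
          printf("%d\n", it->value);
      return 0;
  }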


Keep in mind that if you want to write probably-maybe-correct code, Rust is maturing to be able to get you there more easily than C. But if you want actually-correct code, you need to do the legwork regardless of language; and C has a much more mature ecosystem (things like CompCert C, etc.) that lets you do much of the analysis portion of that legwork on C code, instead of on generated assembly code as you'd have to do for Rust. Combine that with verification costs that don't vary much from language to language, and there's a long future where, for safety-critical applications, there's no downside to C -- the cost of verification and analysis swamps the cost of writing the code, and the cost of qualifying a new language's toolchain would be absurd. For this reason, C has a long, long future as one of the few languages (along with Ada, where some folk are making a real investment in tool qualification) for critical code; and even if it takes a decade for C23 features to stabilize and make it to this population, well, we'll still be writing C code well beyond '33.


  > Combined with verification costs that don't vary that much from
  > language to language, and there's a long future where, for
  > safety-critical applications, there's no downside to C -- the cost
  > of verification and analysis swamps the cost of writing the code
That doesn't sound right. You really want to get the code right early on. The later bugs are discovered, the more costly the fix. You may have to restart your testing, for instance.

If the language helps you avoid writing bugs in the first place, that should translate to quicker delivery and lower costs, as well as a reduced probability of bugs making it to production. The Ada folks are understandably keen to emphasise this in their promotional material.

  > the cost of qualifying a new language's toolchain would be absurd
As I understand it, this typically falls to the compiler vendor, not to the people who use the compiler. A compiler vendor targeting safety-critical applications will want to get their compiler certified, e.g. [0]. To my knowledge we're nowhere near a certified Rust compiler, although it seems some folks are trying. [1]

[0] https://www.ghs.com/products/compiler.html

[1] https://ferrous-systems.com/blog/sealed-rust-the-pitch/


Are you asking about greenfield development only? One big obvious reason to use C23 instead of Rust or C++23 is if you already have a codebase written in C. Switching to C23 is a compiler flag; switching to Rust is a complete rewrite.


Places that are just now adopting C11 will probably adopt C23 in 12 years? C++ is (unfortunately, IMO) making inroads into embedded, but C is also still pretty widely used.


My usages are similar to yours, but new C standards still benefit me because I can opportunistically detect and make use of new features in a configure script.

To use my baby as an example: free_sized(void *ptr, size_t alloc_size) is new in C23. I can detect whether or not it's available and use it if so. If it's not available, I can just fall back to free() and get the same semantics, at some performance or safety cost.
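A minimal sketch of that detect-and-fall-back idea (the HAVE_FREE_SIZED macro and the wrapper name are made up; a real build would define the macro from a configure-time probe):

  #include <stdlib.h>

  static inline void my_free_sized(void *ptr, size_t alloc_size) {
  #if defined(HAVE_FREE_SIZED)
      free_sized(ptr, alloc_size);   /* C23: the allocator may skip its size lookup */
  #else
      (void)alloc_size;
      free(ptr);                     /* same semantics, without the optimization */
  #endif
  }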


The free_sized function is kind of a bad example, though. For years to come, using free_sized will break malloc interposition. Interposed mallocs will not support free_sized initially. (It's currently not in jemalloc, tcmalloc, mimalloc as far as I can see.) If we add it to glibc and your application picks it up, calls to free_sized will end up with the glibc allocator even if malloc/free/… have been interposed. Maybe there is a way to paper over this in the glibc implementation of free_sized (rather than calling free unconditionally), and still do something useful for the glibc allocator. I don't know.


> Maybe there is a way to paper over this in the glibc implementation of free_sized (rather than calling free unconditionally), and still do something useful for the glibc allocator. I don't know.

We emailed about this a little contemporaneously (thread "Sized deallocation for C", from February), and I think we came to the conclusion that glibc can make interposition work seamlessly even for interposed allocators lacking free_sized, by checking (in glibc's free_sized) if the glibc malloc/calloc/realloc has been called, and redirecting to free if it hasn't. (The poorly-named "Application Life-Cycle" section of the paper).


I don't fully understand the need or benefit of having free_sized() available tbh.

Spec says it's functionally equivalent to free(ptr) or undefined:

If ptr is a null pointer or the result obtained from a call to malloc, realloc, or calloc, where size is equal to the requested allocation size, this function is equivalent to free(ptr). Otherwise, the behavior is undefined

Even the recommended practice does not really clarify things:

Implementations may provide extensions to query the usable size of an allocation, or to determine the usable size of the allocation that would result if a request for some other size were to succeed. Such implementations should allow passing the resulting usable size as the size parameter, and provide functionality equivalent to free in such cases

When would someone use this instead of simply free(ptr) ?


> I don't fully understand the need or benefit of having free_sized() available tbh.

It's a performance optimization. Allocator implementations spend quite a lot of time in free() matching the provided pointer to the correct size bucket (as to why they don't have something like a ptr->bucket hash table, IDK; maybe it would consume too much memory overhead, particularly for small allocations?). With free_sized(), that lookup can be skipped.
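As for when a caller would actually have the size handy: almost any container already does. A small hypothetical sketch (not from any particular codebase):

  #include <stdlib.h>   /* free_sized is declared here in C23 */

  typedef struct { size_t cap; int *data; } int_vec;

  static void int_vec_drop(int_vec *v) {
      /* the vector already knows its capacity, so the allocator doesn't
         have to re-derive the allocation size from the pointer */
      free_sized(v->data, v->cap * sizeof *v->data);
      v->data = NULL;
      v->cap = 0;
  }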


Thanks for your insights, which prompted me to actually jump into the malloc.c implementation.


Many, many, many teams writing C won't be using C23 the day it's out, but they have to get these changes in now if they want the people who always use a 10-year-old standard to have these features available 10 years from now.


I'm also a C developer, but I do use the more modern versions.

There are four big reasons why:

* Atomics. These are the biggest missing feature in older C.

* Static asserts. I can't tell you how much I love being able to put in a static assert to ensure that my code doesn't compile if I forget to update things. For example, I'll often have static constant arrays tied to the values in an enum. If I update the enum, I want my code to refuse to compile until I update the array. I have 20 instances of static asserts in my current project. (See the sketch after this list.)

* `max_align_t`. It's super useful to have a type that has the maximum alignment possible on the architecture.

* `alignof()` and friends. It's super useful to get the alignment of various types. Combined with `max_align_t`, it is actually possible to safely write allocators in C. Previously, it wasn't really possible to do safely or portably. And I have at least three allocators in my current project.
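A minimal sketch of the last three points (the enum/array names and the toy bump allocator are invented for illustration):

  #include <assert.h>      /* static_assert (C11) */
  #include <stdalign.h>    /* alignas, alignof (C11) */
  #include <stddef.h>      /* size_t, max_align_t */

  enum color { COLOR_RED, COLOR_GREEN, COLOR_BLUE, COLOR_COUNT };

  static const char *color_names[] = { "red", "green", "blue" };

  /* Refuses to compile if the enum grows but the array doesn't. */
  static_assert(sizeof color_names / sizeof color_names[0] == COLOR_COUNT,
                "color_names[] is out of sync with enum color");

  /* The arena is aligned for any standard type, and every allocation is
     rounded up to that alignment, so returned pointers are always usable. */
  static alignas(max_align_t) unsigned char arena[4096];
  static size_t arena_used;

  static void *arena_alloc(size_t n) {
      const size_t a = alignof(max_align_t);
      size_t off = (arena_used + a - 1) / a * a;   /* round up to alignment */
      if (off > sizeof arena || n > sizeof arena - off) return NULL;
      arena_used = off + n;
      return &arena[off];
  }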

You're right that C11 doesn't have nearly the reach that ANSI C does, but it does have slightly more than Rust, much more if you consider Rust's tier 3 support to be iffy, which I do.

And it does have one HUGE advantage against Rust: compile times. On my 16-core machine, I can do a full rebuild in 2.5 seconds. If I changed one file in Rust, it might take that long just to compile that one file.

That's not to say Rust is without advantages; one of my allocators is designed to give me as much of Rust's borrow checker as possible, on top of APIs designed around that fact.

tl;dr: I use modern C for a few features not found in C89, for the slightly better platform support against Rust, and for the fast compiles.


Except for max_align_t (which is broken even for scalar types on some targets, and doesn't help with vector types by design), all these things were available long before standardization. So I'm not sure if this is a compelling argument for standardization.


Without standardization, I have to rely on specific compilers. That's not great, either.


Outside our bubble, there’s an _ocean_ of embedded software/firmware and lower level library stuff, on up-to-date platforms, written in C by people or teams that would find switching to Rust just a _massive_ chore. I’d guess there is at least an order of magnitude more of this than Rust.

And I certainly appreciated C11 when writing Objective-C, so I’m sure people with large codebases of ObjC will appreciate it (though most will be using Swift for new features nowadays).


One thing I'm really looking forward to is standardization of binary literals. Bitwise masking makes a lot more sense with binary literals than hex literals.

Example:

https://pasteboard.co/VkjrJIOZzaiR.jpg

(Sorry for pasting code as an image, I'm on my phone)
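(Not a reproduction of the image, just an illustration of the point: 0b literals have long been a GCC/Clang extension and are standardized in C23. The register and field names below are invented.)

  #include <stdint.h>

  #define CTRL_ENABLE   0b00000001u   /* vs 0x01u */
  #define CTRL_MODE_MSK 0b00001110u   /* vs 0x0Eu - the field boundary is visible */
  #define CTRL_IRQ_EN   0b10000000u   /* vs 0x80u */

  static inline uint8_t ctrl_set_mode(uint8_t reg, uint8_t mode) {
      return (uint8_t)((reg & ~CTRL_MODE_MSK) | ((unsigned)(mode << 1) & CTRL_MODE_MSK));
  }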


Old software is very slow and expensive to change. Adopting a new C version doesn't need a failure-prone, expensive, synchronized collective-action rewrite throughout your sector's supply chain, new tooling, platform runtime ports, etc. Rust would.


C provides the only stable ABI for Rust, and changes to the C++ ABI may also occur in the future. So the implications of new C standards for library code are especially relevant.


There is no "modern" C but "C with additional niceties". And those additions are usually low key enough to be adopted by a good portion of the compilers out there.

When you have a C code base or experience with C those features may be enough not to make a complex transition.

Having a simple tool evolve a bit may be what you need as opposed to making the change to a much more complex tool.


I believe the current WG14 charter is here: https://www.open-std.org/jtc1/sc22/wg14/www/docs/n2611.htm

This text is still in force, it seems:

“ 13. Unlike for C99, the consensus at the London meeting was that there should be no invention, without exception. Only those features that have a history and are in common use by a commercial implementation should be considered. Also there must be care to standardize these features in a way that would make the Standard and the commercial implementation compatible. ”

I read this as saying that anything that gets standardized should be available in one of the major implementations. In practice, most of the qualifying features will have been implemented in both GCC and Clang in the same way, so for most users, there is not much benefit from standardization. Some may feel compelled to support the “standard” way and the “GCC/Clang” way in the same sources, using a macro, but that isn't much of a win in most cases. Of course, there will be shops that say, “we can't use a feature until it's in the standard”, but that never really made sense to me.

Things are considerably murky on the library side. In my experience, library features rarely get standardized in the same way they are already deployed: names change, types change, behavioral requirements are subtly different. (Maybe this is my bias from the library side because I see more such issues.) For programmers, the problem of course is that typical applications do not get exposed to different compiler versions at run time, but it's common for this to happen with the system libraries. This means that the renaming churn that appears to be inherent to standardization causes real problems.

Others have said that new standards are an opportunity to clarify old and ambiguous wording, but in many cases the ambiguity hides unresolved conflict (read: different behavior in existing implementations) in the standardization committee. It's really hard to revise the wording without making things worse, see realloc.

So I'm also not sure what value standardization brings to users of GCC and Clang. Maybe it's different for those who use other compilers. But if standardization is the only way these other vendors implement widely used GCC and Clang extensions (or interfaces common to the major non-embedded C libraries), then the development & support mode for these other implementations does not seem quite right.

