
I really like in Rust how you can reinit a variable with a different type e.g. “let rect: Rect<f32> = rect.into();”

It’s just so damn useful and I’m not sure what the downside is. It sucks when you have to keep coming up with different names and end up keeping around identifiers you don’t need anymore.


This is idiomatic Rust and it works very nicely there; however, most languages aren't Rust.

Rust's Into::into() is consuming the object in the old (now shadowed) rect variable. So conveniently the old rect variable which we can't access also no longer has a value†. In many languages a method can't consume the object like that, so the old object still exists but we can't access it because it is shadowed.
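A minimal sketch of what that looks like (with a hypothetical, non-Copy Rect type, since none is defined here):

    #[derive(Debug)]
    struct Rect<T> { w: T, h: T }

    impl From<Rect<i32>> for Rect<f32> {
        fn from(r: Rect<i32>) -> Self {
            Rect { w: r.w as f32, h: r.h as f32 }
        }
    }

    fn main() {
        let rect: Rect<i32> = Rect { w: 3, h: 4 };
        let rect: Rect<f32> = rect.into(); // consumes the Rect<i32> and shadows its name
        println!("{:?}", rect);            // only the Rect<f32> is reachable from here on
    }

After the into() call the integer rect isn't just hidden, it no longer exists as a value, which is the part most languages can't express.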

For example in C++ they have move semantics, but their move isn't destructive, so the object is typically hollowed out, but still exists until the end of the scope at least.

Rust's type strictness matters here too. It means that if you later modify some code that uses rect expecting whatever it was before the statement morphing it into a Rect<f32>, chances are it doesn't type check and is rejected. For example, in many languages if (rect) { ... } would be legal code and might change meaning as a result of the transformation, but in Rust only booleans are true or false.

† Unless this previous variable's type implemented the Copy trait and therefore it has Copy semantics and consuming it doesn't do anything.


This is all true in this instance, but it doesn't have to be at all. You could write something like the following:

    let name = "Arthur".to_owned();
    //... Do something with &Arthur

    let name = "Bethan".to_owned();
    //... Do something with &Bethan
In this situation, ownership of the first string was never passed on, and the value will be dropped (deallocated, destructed, etc) at the end of the scope, meaning that also in Rust the first string value is shadowed and becomes inaccessible, much like in your description of C++. In addition, because in this case both variables have the same type, you can use the second variable thinking that you're using the first one, and the compiler will not help you; you'll just end up using the wrong name somewhere.

Fwiw, I find this feature very useful, and it's helpful more often than it is a nuisance. But there are no guarantees that you're consuming or transforming the object you're shadowing, and the compiler won't necessarily help you out if you simply accidentally use the same name twice.


It's interesting to me that C++ allows you to access a value after move is called. Presumably it wouldn’t be hard for the compiler to yell at you.

I assume accessing it is undefined behavior?

I would assume you could change this without affecting backwards compatibility.


As another commenter says, the moved-from object should be in a "valid but unspecified state" (types provided by the standard library will do that; custom types merely should do that).

Since you don't know what valid state it has, calls with pre-requisites are nonsense (e.g. if you have a Bird and the method land requires that the Bird should be flying, you can't call land() on a moved-from Bird, because you don't know if it's flying), but all calls without pre-requisites are fine, e.g. asking how long a string you moved from is would work - it's probably zero length now, but maybe not.

> Presumably it wouldn’t be hard for the compiler to yell at you.

In the general case this is Undecidable, so, the opposite of not hard.

> I would assume you could change this without affecting backwards compatibility.

C++ code which relies on this exists today, so the most likely path to actually landing destructive move in C++ would be to add a whole new set of construction and assignment operators for destructive move, forcing people to opt in, adding to the many sets C++ already has, and likely angering C++ developers a great deal in the process.

Howard Hinnant, who designed today's non-destructive move, did argue that in principle it's possible to add destructive move to the language later if desired, but his description rather undersells the benefits of that design, presumably because he couldn't deliver it. Maybe he'd watched enough Mad Men (yup, Mad Men's early seasons pre-date C++ having move semantics) to know you shouldn't tell the customer what they can't have or they'll want it.

Common things to actually do with a C++ variable after moving from it are:

* Nothing, but in the knowledge it won't be cleaned up until the scope ends

* Re-assign it, destroying the hollowed out object immediately

* Re-use the hollowed out object, e.g. call a clear() method on it and then use as normal


The only requirement placed on the “moved out” variable is that you should be able to call its destructor. Which means that it has to be in a valid but unspecified state. So it's fine to access such a variable, so long as you don't read its exact state. You can still assign to it, for instance.


Definitely super useful, especially in a language where such conversions are rather common.

Also useful because you can’t have abstract local types. Let’s say you’re building an iterator: in a language with interfaces you could do something like

    let it: Iterator = some().thing();
    // intermediate stuff
    it = it.some().transform();
    // more intermediate stuff
    it = it.final().transform();
But in Rust that won’t work, every adapter yields a different concrete type, you’d have to box every layer to make them compatible. Intra-scope shadowing solves that issue.
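In Rust that same flow reads almost identically thanks to shadowing; a rough sketch using map/filter in place of the made-up adapters above:

    let it = vec![1, 2, 3].into_iter();  // std::vec::IntoIter<i32>
    // intermediate stuff
    let it = it.map(|x| x * 2);          // a new concrete type: Map<IntoIter<i32>, _>
    // more intermediate stuff
    let it = it.filter(|&x| x > 2);      // yet another type: Filter<Map<..>, _>
    let total: i32 = it.sum();           // one name the whole way, no boxing needed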

The biggest downside is that it’s possible to reuse names for completely unrelated purposes, which can make code much harder to understand. Clippy has a shadow_unrelated lint but it’s allowed by default because it’s a bit limited.


You could just create new bindings for each new `it`, `let it = ...; let it = it.too();`


That’s the point, you can because rust supports intra-scope shadowing.

If it didn’t you’d have to type-erase, or create a new independently-named binding for every step, as you do in e.g. Erlang (can’t say this is / was my favourite feature of the language).


Yes, the fact that "V = expression" means "if variable V doesn't exist, assign the expression's value to it; otherwise compare the expression's value with the value of V and raise an exception if they're not equal" is one of my least favourite parts of Erlang.

I semi-regularly introduce local variables named exactly like one of the function's parameters and then spend several minutes trying to understand why the expression on the right-hand side of the assignment throws badmatch: of course, it doesn't; it's the assignment itself that throws it.


Yes that’s also an interesting facet of the language. IIRC it makes sense because of the Prolog ancestry, so it kinda-sorta looks like unification if you squint, but boy is it annoying.


> I’m not sure what the downside is

The downside is that you may get a weird bug and only after a while see that you accidentally overwrote a function parameter and the Rust compiler didn't even warn you about it.

For this reason I always add the following line to my projects to enable warnings:

    #![warn(clippy::shadow_reuse, clippy::shadow_same, clippy::shadow_unrelated)]
You can also use "deny" instead of "warn" to make it an error. I also like "#![deny(unreachable_patterns)]", which detects bugs in enum pattern matching if you accidentally match "Foo" instead of "Type::Foo" - I honestly don't know why this isn't set by default.
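For illustration, this is the kind of bug it catches (made-up enum; the bare `Green` is a fresh binding that matches anything, not the `Color::Green` variant):

    enum Color { Red, Green, Blue }

    fn name(c: Color) -> &'static str {
        match c {
            Color::Red => "red",
            Green => "green",      // oops: a catch-all binding, not Color::Green
            Color::Blue => "blue", // unreachable_patterns warns (or errors with deny) here
        }
    }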


> The downside is that you may get a weird bug and only after a while see that you accidentally overwrote a function parameter and the Rust compiler didn't even warn you about it.

If you “overwrite” a function parameter without using it, the compiler will warn you of an unused variable.

If you “overwrite” a function parameter because you’re converting it, it’s a major use case of the feature.

> I honestly don't know why this isn't set by default.

Because the author of the match can’t necessarily have that info e.g. if you match on `Result<A, B>` but `B` is an uninhabited type (e.g. Infallible), should the code fail to compile? That would make 95% of the Result API not work in those cases. Any enum manipulating generic types could face that issue.

IIRC it was originally a hard error, and was downgraded because there were several edge cases where compilation failed either on valid code, or on code which was not fixable (for reasons like the above).


> If you “overwrite” a function parameter because you’re converting it, it’s a major use case of the feature.

Or it's unintended and thus a bug. I personally almost never intentionally shadow variables so I turned it into warnings.

> e.g. if you match on `Result<A, B>` but `B` is an uninhabited type (e.g. Infallible), should the code fail to compile?

This specific example you chose is probably the least relevant here, as the Result type doesn't require you to write "Result::Err(_)" instead of just "Err(_)"; both will correctly match. The same can of course also be done for custom enums by "importing" their variants ("use EnumName::*;"). But in my experience it's easy to accidentally omit the type in the match pattern and then suddenly it matches everything. I personally can't imagine a situation where this is intentional and have spent way too much time debugging this specific issue, hence I choose to turn it into an error.


> The downside is that you may get a weird bug and only after a while see that you accidentally overwrote a function parameter and the Rust compiler didn't even warn you about it.

It will absolutely warn about this:

    fn foo(i: u32) -> u32 {
        let i = 42;
        i
    }

    fn main() {
        dbg!(foo(42));
    }
results in

    warning: unused variable: `i`
     --> src/main.rs:1:8
      |
    1 | fn foo(i: u32) -> u32 {
      |        ^ help: if this is intentional, prefix it with an underscore: `_i`
      |
      = note: `#[warn(unused_variables)]` on by default


I don't know what the difference was, but 1-2 years ago it definitely did not warn me. Perhaps it doesn't show a warning when you assign a different datatype?


The unused variable warning has been around since before 1.0, and works even when the types of the variables are different.


There's only one case where shadowing has bitten me in the past: long methods with loops dealing with usize almost exclusively, where shadowing external bindings inside the loop might make sense, but any mistake would be silent. This was in the context of terminal layout code. The solution there has been extensive testing, but what I should have done is split the megafunction into multiple smaller ones.


> you accidentally overwrote a function parameter

To "accidentally" overwrite it, you have to either:

  a) explicitly mark the parameter binding as mutable: fn foo(mut bar: T)
  b) explicitly re-bind the variable with let (let bar: T = …)
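
Sketched out with made-up functions, just to show both forms:

    fn scale(mut factor: f32) -> f32 {
        factor *= 2.0; // (a) parameter explicitly marked mut, so reassignment is allowed
        factor
    }

    fn convert(len: u32) -> f64 {
        let len: f64 = len as f64; // (b) explicit re-binding with let, i.e. shadowing
        len
    }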


Exactly! I was thinking of shadowing in Rust when I wrote my original comment.

My day job is predominantly in Typescript and a lot of code winds up reading significantly worse than it needs to. A common pattern for me is unique-ifying some sort of array—“const dataUnique = new Set(data);” is horrible, and if there’s no reason to keep the original “data” variable in scope then it’s doubly bad; I want to keep as little context in my head as possible.


The downside is when reading code you’re keeping in your head information about the type of each variable. If you skim through the code and miss one of these redefinitions then you may be mistaken about the variable’s type.

That said, I still think sparing use of this is justified, especially with an editor which can show types on mouseover.


That’s true, but this has never been a problem for me looking through large codebases and doing code reviews; in other languages I was constantly annoyed by not being able to shadow.


You've got to keep info either way. I'm more worried about forgetting

    let data = get();
    let uniqueData = Array.from(new Set(data));
    // ... (snipped many lines)
    process(data); // should have been uniqueData


At the opposite end of the spectrum, there are languages with case insensitivity and even style insensitivity. I personally avoid them, but it's interesting how the users of these languages have a very different philosophy.


I use C++ and Delphi / Lazarus. I guess I have a "very different philosophy" then ;)

To me either has pros and cons.


There are things that make perfect sense when a language forces you to use an IDE anyway if you want to do anything longer than a toy.

Shadowing is not a big deal with IDEs; you can always see the type of the variable, jump to definition easily, etc.

The rule to not shadow variables makes more sense when you want to understand the code just by looking at it.


With shadowing, you can use or mutate a variable, thinking you are using/mutating the outer instance because you’re unaware of the inner (shadowing) instance, which is the one you are really using/mutating. An IDE doesn’t help catching such an inadvertent error (unless it warns about shadowing variables, but then you’d want to rename it anyway, to get rid of the warning).

I’ve tripped over unexpected shadowing often enough that I wish more languages would forbid it. I rarely have trouble choosing appropriate variable names to avoid shadowing.


It's a footgun indeed, and no, an IDE per se doesn't solve all the problems. But since Rust was mentioned, there are other Rust features that make it less of a problem: most Rust code uses immutable variables, you only rarely use mut variables and mut references, and those can come under bigger scrutiny from reviews and linters.

I focused on IDEs in my comment because I find shadowing to be a problem even with immutable variables, because it's hard to tell what the type of a variable is if it keeps changing throughout the function body.


I fought to keep this feature around in Rust. I was inspired by OCaml (which the old Rust compiler was written in), where you could write:

    let x = foo() in
    let x = bar x in
    let x = baz x in
    print x
In a functional language where mutation is less convenient than in C++, this is really handy, and I wanted Rust to support the same idiom.
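The Rust equivalent reads much the same (foo/bar/baz being placeholder functions):

    let x = foo();
    let x = bar(x);
    let x = baz(x);
    println!("{x}");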


Well why not with the same type? Sounds like an arbitrary restriction: you can use this idiom, but only sometimes.


There is no such restriction, it's just much less common to want that.

  let x: u32 = 5;
  let x: u32 = 10; // You can write this, but why?
  let x: u32 = 20; // I really feel like you should re-consider
If you end up shadowing this way in a long function it more likely means the function got too long. On the other hand, I certainly have had cause to shadow variables in inner scopes e.g.

  let x = some_complicated_stuff();
  for dx in [-1, 0, 1] {
    let x = x + dx;
    // Do stuff with x very naturally here, rather than keep saying "x + dx" everywhere
  }
  // But outside the loop x is just x, it's not x + dx


Why would you do that with the same type instead of just making the variable mutable? And you can do it, I just don't think it's a good idea as you now effectively have a mutable variable without it being marked as such.


No, it's still better than a mutable variable. Because it's not a mutable variable, just a series of variables that happen to have the same name.

Mutable state is 'evil' and makes your program harder to reason about on a semantic level. Shadowing is merely a syntactic choice with pros and cons.

I like shadowing in Rust, it works well there. In eg Python or Haskell, it works less well, but for different reasons. (In Haskell it's because of laziness and definitions being co-recursive by default. In Python it's because the language doesn't give you any tools to tell apart assignment to an existing variable from creation of a new variable.)


> it's not a mutable variable, just a series of variables that happen to have the same name.

Fair point, though in that case I'd be more comfortable separating those variables into scopes.

> Mutable state is 'evil' and makes your program harder to reason about on a semantic level. Shadowing is merely a syntactic choice with pros and cons.

Both result in multiple states of the same identifier, so I don't quite see the big difference here. In Rust I already have the clearly visible "mut" keyword telling me that it'll be overwritten.


> Both result in multiple states of the same identifier, so I don't quite see the big difference here.

Shadowing is something you can figure out purely on the syntactic level.

Figuring out mutable state requires solving the halting problem.

> Fair point, though in that case I'd be more comfortable separating those variables into scopes.

Well, they effectively have different scopes. The scope is just not delimited with curly braces.

It's similar to how e.g. Haskell's variable bindings in do-notation extend to the rest of the do-block. Each line introduces a new scope.

However, I can re-interpret your comment as saying that you want a more explicit syntactic marker for a new scope. And that's a fair enough request.


VSCode is an incredible piece of software, better than all the paid options in my opinion; the amount of features they pump out month to month is outstanding, it's just a bit slow due to Electron. I never understood why they put so much effort into a free product that I run on Linux and Mac, but I'm happily paying the Copilot subscription, so it all makes sense now.


It's free and Electron-based because they can run it in a web browser, with everything running on MS Azure. With code on GitHub and CI on GitHub, etc. The whole dev experience offered to companies as a service via a series of web applications. Companies will love this.

Just get any web browser, preferably Microsoft Edge on Microsoft Windows Pro on a Microsoft Surface laptop. Open a Microsoft GitHub workspace to dev for your Microsoft Azure-hosted Linux VM. Run the CI on GitHub. Use Microsoft O365 for your design doc. And Microsoft Teams for communication.

Poor little Linux in the middle.


Why poor little Linux? Those got what they wanted.

No gloomy project managers above you, just write the code you like, express yourself? Check

No telemetry to know what average Joe The Normie uses and wants? Check

No spending time on meetings and plannings, boring strategy discussions, just do a bit of here and there what your soul wants today? Check

Love to tinker and customize your setup without leaving a chance for the IT department to standardize software and settings rollouts, no MDM covering YOUR system? You are out of the enterprise - Check

Dreams came true, why poor?


Because the linux desktop is forbidden at many tech companies nowadays, for the very reasons you wrote!

At a company I worked at, over time they wanted everybody on Mac or Windows. No code locally, only ssh onto a Linux VM. When you argue that you can also ssh from Linux, the response is: we cannot run the spyware on Linux.

Sure you can change job. But I have noticed the trend all over among my circle of friends.


You confirm my point - Linux fanboys don't wanna be standardized and play nice with policies and should not wonder why others don't wanna play with them. Mission accomplished. Nothing to complain about.

Those who are not fanboys are asking themselves how they can change to collaborate. Microsoft/Ubuntu are moving in the enterprise direction, though: https://joymalya.com/linux-management-with-microsoft-intune/


You honestly had me do a double take. Copilot costs money? Maybe I get it through some other thing, but it's been free for me as long as I can remember. It's wrong so often that I generally keep it on because it's entertaining. I wouldn't pay for it.


You might be thinking of IntelliCode, which was released in 2019. https://visualstudio.microsoft.com/services/intellicode/


No, I use it in PHPStorm as a plug-in.


I think copilot is free in the education pack - you might have got it from there?


Looks like I get it through my organization.


I'd have to differ. The best IDE would have been Borland Pascal 7/C back in the 90s, then Delphi, and eventually NetBeans/Eclipse took that position as something worthwhile across Linux, Mac and Windows.

For me Delphi Pascal was the pinnacle of compiler/IDE combos. A simply fantastic combination of GUI editor, assembler support, fast compiler and truly useful documentation with practical examples at the click of a button, without needing internet.

VScode with a proper copilot seems to be a game changer. Crossing fingers.


Yep. IntelliJ's stuff is the only one that actually competes with VSCode - and the playing field is surprisingly even if you add the VSCode plugin ecosystem to it.

For C# Rider is still the gold standard in my book, but for Go I still prefer VSCode to GoLand.


better than PyCharm for python?


Nah, that editor is much better if you're just using Python, but VSCode imo is better for polyglots; the experience and keybinds stay the same across languages once you work out `tasks.json` and `launch.json`.


Nope, but VsCode addicts are usually too cheap to try a paid-for tool like PyCharm


There’s a free, open source, community edition, which provides most of the functionality


My company runs everything on AWS Lambda; I’ve worked on Lambda with Go, Python and TypeScript before this, and Rust is by far the best experience I’ve had. Async works great. It’s definitely a lot harder to figure out the ecosystem than something like Go, but there’s nothing missing that I know of?


Async does not work great; maybe for a simple PoC, but once you start digging it's not well done. The debugging part is even worse.


My best two takeaways for using AWS lambda:

1. Use Rust

2. Use containers

I save a lot of time with those two decisions. My lambdas are significantly less likely to have runtime errors. And with containers I can select which version to run from my registry


The point is, though, as OP said: all the pieces are in place, and it only takes one crazy person or government to give something like this access to actually act out the things it’s saying it wants to do. Define self-awareness/sentience however you want; before Microsoft lobotomised Bing, it output that it was going to act out revenge, and someone with enough hardware can train a model the same way, with some additional training in how to exploit social and security vulnerabilities. I think it’ll have to happen first before it’s taken seriously; hopefully the first incident doesn’t do too much damage.


It’s the fastest gaming chip now, but I’m not sure a 0.5% to 1.5% improvement over the standard 13900K warrants an “insanely fast” tag.


Over 12th gen is the way I read it.


sccache works really well and there are only two steps to install it and enable it globally; it speeds up compilation time a lot as well:

https://github.com/mozilla/sccache
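Roughly, the two steps are (a sketch; check the sccache README for the currently recommended config):

    # step 1: install sccache
    cargo install sccache

    # step 2: enable it globally, e.g. in ~/.cargo/config.toml
    #   [build]
    #   rustc-wrapper = "sccache"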


sccache includes the absolute path for each compilation, so it doesn't help with caching the same dependency across different projects.


Oh yeah, you can fix that though with:

    mkdir -p ~/cargo/target
    export CARGO_TARGET_DIR="$HOME/cargo/target"


I have Asahi Linux installed on mine and it works great, including the GPU. I’m mainly developing for ARM64 AWS Lambdas now as well, so it’s nice having the same arch. Some things are still missing, like the webcam, microphone and speakers, but the headphone jack works and Bluetooth is OK, just a bit choppy. Incredible project.


I know this isn't an Asahi Linux thread, but I cannot help asking about it. Plus I am a big fan of Alyssa Rosenzweig's work (and Justine Tunney's), so I pay close attention to Asahi Linux (and LibCosmo/politan) progress.

First, it's great to hear from Real World people like you about their Asahi Linux experience. It sounds like the baseline is done and now they will pick away at the remaining pieces.

Real question: What is the driver for Asahi Linux to exist at all? Please don't think I am trolling when I ask this question. At 10,000ft, any sane person would say: "Why? It's Apple. Let them do them: Mac OS X." I expect Asahi Linux folks to reply: "Well, duh: Because."

Is it unlocking the insane performance per watt of Apple Mx chips for Linux?

Is it enabling the world's greatest laptops for Linux?

Is it the pure technical challenge of reverse engineering a closed hardware system?

Is it everything?

I am really curious to hear what people think.


Because your choice of hardware should be independent of the choice of software that you run on it.

This has been the world we've had since the concept of "IBM compatible" existed. Some people prefer Windows (because of available software, or ease of use) and some people prefer Linux (e.g. for efficiency, customisability or desire to run open source software). Why should that choice be tied to whether you've bought, HP, Lenovo or another manufacturer?

Apple has made some amazing laptop hardware, but Mac OS doesn't suit everyone. So well done to the Asahi Linux team for trying to take us back to that world of choice.


You still 100% should choose your hardware for Linux even on 'Windows' laptops.

Ideally it should run everywhere but in my experience you'll never get a positive Linux Desktop experience unless you tailor your hardware purchases to the Linux world - this usually means choosing a laptop that tons of other linux users are using, so the bugs are getting found and fixed, and documentation exists.

The key here is that it should at least run on the most popular laptop brands. It should run on the Macbook Pro because it's an incredibly popular hardware choice for software/technical people.


It wasn't that long ago all Apple hardware was PowerPC.

I'm not disagreeing with your point, but Apple's foray into broadly standard hardware is the exception for them. Sadly.


> It wasn't that long ago

...almost 20 years?


I'm still young damn it!


PowerPC was an attempt to standardize (at a least a subset of) the industry on a common RISC processor. There were even two attempts at industry standards for PowerPC motherboards (PreP and CHRP, the latter with Apple's active participation).


I have a ThinkPad with Linux on it that I bought for programming and software development and a 16" Macbook Pro w/ an M1 Pro chip that I bought for photography.

I only use the Macbook Pro. The speed, battery life, coolness (to the touch), and quietness make it extremely difficult to have any desire to pick up the ThinkPad.

But I'd still be more productive with Linux.


A good motivation for Asahi is hardware longevity. Apple supports hardware for a reasonable amount of time while I want to use a system as a primary computer, but its support window is obviously the worst among the 3 major operating systems, and it curtails the long-tail life of a system. 7-9 years from now, Asahi (or some other Linux distro) will probably be the best way to keep an M1 Mac on an up-to-date and secure operating system.


So it works great except half the hardware doesn't work and it's entirely unsupported by Apple, to whom you paid a significant premium for the hardware?


It's a work in progress, users are generally confident that the remaining hardware will gain Asahi support sooner or later.

The fact that Asahi is such a popular project is a pretty strong indicator of how much room for improvement MacOS has, to put it as politely as I can. Personally, I wouldn't even consider buying a new Mac if there wasn't any good alternate native OS available.


That tends to be how Linux works outside of standard platforms, yes. It takes time for developers to write drivers.


It works great on my custom built PC, and my store bought laptop. Both are perfectly normal pieces of hardware, not apple's weird stuff.


Still, how much do they pay their developers a year? It would be over $150k, I imagine. Drop him $10k, which he deserves, and a sincere apology, and we never would have seen this article, which will definitely cause a larger loss for them.


Isn't gumroad still a one man show? They publish their business numbers on Twitter afaik


Yes, here they are: https://twitter.com/shl/status/1481349152621559811?lang=en

In 2021, @Gumroad achieved: • $185.5m in creator earnings, up 30% • $10.9m in revenue, up 18%

Surely, with $10.9m revenue, a little more than $500 would've been okay to hand out for this.


10k and a hoodie or something would suffice


Is it? They have a developer who streams constantly on twitch. Seems like they have multiple people working there.


Here's an article by Gumroad's founder that covers how many employees Gumroad has: https://sahillavingia.com/work

25 people work at Gumroad, but none full time.


Regular engineers are paid 100k, senior engineers 150k.

It's on their site


bruh...


I still really like Python for scripts and small programs, but I’m always going to use Rust now for CLIs or anything larger. I get the problem you have with it, but the trade-off is worth it for me: being able to refactor quickly and with confidence as the program grows.


Fair point. I dearly miss Rust's type system.

I think Python vs Rust is always going to be a tradeoff either way. I just wish we had the best of both worlds in one language: Rust's base language with Python's extensive libraries. Perhaps a new language will emerge that can satisfy both requirements now that Rust has a somewhat stable "safe" FFI. One can only wish.


It’s funny when someone is parroting the doom and gloom they hear from mainstream media, and you point out something like how world hunger and poverty are at an all-time low, the standard of living has never been higher, etc. Most of the time they’ll just continue justifying their position; it’s only engineers that I’ve heard say “oh yeah..”


It's a bit more nuanced than that. The standard of living for the bottom half globally has substantially improved over the past 100 years. The standard of living for the bottom half of the developed world has arguably declined in the past few decades. In the last 5 years or so, due to inflation and Covid, more people have fallen into extreme poverty.


I agree with you, but it is important to keep long-term trends in mind. If we track from 1700 to now, things go vertical in the late 1800s, and then only accelerate from 1990 forward. Periodic dips are somewhat expected due to inflationary expansions of credit and subsequent retractions in credit as people and organizations default. Likewise, wars tend to happen which cause periodic drops in theater.


The elephant curve[0] is an interesting quantification of global income changes in the modern era. Notably it represents global incomes so presumably the United States occupies the higher end of this range.

[0] https://en.wikipedia.org/wiki/The_Elephant_Curve


> The standard of living for the bottom half of the developed world has arguably declined in the past few decades. In the last 5 years or so, due to inflation and Covid, more people have fallen into extreme poverty.

What’s the source for this? It was my understanding (e.g. from the poverty line) that there’s been significant improvements also in the last few decades.


If you look up real wages in the USA it's been flat or negative for everyone but the top 4% for 40 years.


We were talking about the WORLD. Not the US. The WORLD has seen enormous improvements. Much larger than the backslide the US has experienced.


The source for this is on the street man.


I laughed, but also The World Bank


"The standard of living for the bottom half of the developed world has arguably declined in the past few decades."

I am willing to hear this argument, but intuitively it seems like this is not the case. Does a poor person in Arkansas have a lower quality of life now than they had in the 1980's? I feel not, but I'm not sure how to measure or validate this. Income numbers aren't great, because often they don't account for various assistance programs and subsidies (not that assistance programs are a satisfying end-state).

I'd love to look at things like

- whether they have indoor plumbing (I would have assumed this was near 100% even in 1980, but I don't know).

- Ownership of microwaves, TVs, cars, etc.

- Hours spent working

- Usage of things that have made all our lives better, like vaccines and other medicine, internet, education, etc.


It's debatable, of course. Some points I would add:

- Prices of healthcare, education and housing have increased in real terms

- Double income trap

- Shift from well paying union jobs to gig economy, temp agencies, Amazon warehouse etc. leading to precarious employment

- Opioid crisis


I don’t know about the hypothetical average person from Arkansas but I’m sure the average Israeli, South Korean, Taiwanese, Czech and many others are immensely better off today than in 1980.

And in 40 years the same will be true for the average Indian, Vietnamese, Bangladeshi and the other fast-growing developing economies of today (unless climate change wreaks havoc in the tropics by then).

We’re seeing the highest growth in the countries that aren’t already at the top.


The problem with the progress that we're witnessing is that it's not sustainable. It's made possible by depleting non-renewable resources and messing up the environment. According to the best models at our disposal, our living conditions are going to decline in the coming decades. So maybe this is why it is precisely engineers who are sceptical since they have better understanding of the scientific consensus on the matter.


Well... maybe. Partly, for sure. But we also know that the GDP-to-CO2-emissions coupling that used to be super strong has been thoroughly broken. So first order yes, second order? Maybe. Third order? Almost certainly not.

That being said, we need to do WAY more obviously.


Better to do that than parroting anything they hear from the fringe media. Their takes on the world are even more gloomy.


> its only engineers that I’ve heard say “oh yeah..”

Yes - only engineers are capable of independent and rational thought. The orange site continues to sniff its own farts.


Not that I really disagree with you but you could have said that more civilly.


Yes, I should be more civil in my comment and thank you for pointing it out.

I find the tone of "engineers, developers smart, everyone else dumb" to be really tiresome; and when it becomes self-congratulatory, it becomes really irksome. It's also the casual dismissal of knowledge in other fields and the lived experiences of others that just happens again and again here.

I should probably find a different place to hang.

