Hacker News | theK's comments

One more good reason to start strengthening the idea of global governance?


No


Absolutely not. Wehret den Anfängen ("resist the beginnings").


Sometimes I wonder whether ETFs that track top valuations will lead to some weird stickiness and overvaluation in, say, the S&P 500.


I have the same thoughts. Eventually there will be a lot of money to be made breaking the S&P 500.


Can you explain the "breaking" trade? And why haven't we seen more written about it?


Think of when George Soros broke the Bank of England for an example of the type of trade.

There is a lot of demand for the S&P 500 index, but that demand isn't exactly tied to the fundamentals of the index. The price isn't tied to the value of the underlying companies; it's tied to the demand of people looking to save for retirement or for a place to store a nest egg. This is an opportunity for price discovery to get things wrong, and eventually the market should correct that.

https://www.investopedia.com/ask/answers/08/george-soros-ban...


I don’t get it. Soros was able to break the pound because the UK government was committed to maintaining an artificial price. That’s nothing like an ETF or mutual fund of the S&P 500, which probably has one of the most efficient (relative to the capital involved) market price discovery mechanisms in history.


Not sure what they mean specifically, but you've probably seen a lot written about it, in terms of BRICS, the petrodollar, ARM in China, subsidies on electric cars, and so on.

Personally I try to avoid investing in the US for political reasons, besides the wishful expectation that the empire could fall within my lifetime and hence be a not-so-good investment.


Probably. The biggest blind spot internal auditors have is things that didn't leave a paper trail.

It is too common that such investigations don't even start because there is just one connecting piece of evidence missing.

Leave a paper trail, people!


The article talks about how this one fungus, found in the depths of the sea, can break down PE (polyethylene) plastics. The biggest problem with combating ocean plastic is deployment of any solution. The seas are vast, and trash, while a huge problem, is still relatively sparse within them.

It would be great to see whether this fungus can be deployed on land at a large enough scale to take care of, say, a whole region's PE waste. That way we could get somewhere.


> We need her in the lab, not at fancy dinners.

I think it is naive to take such a stance. This is the thinking that leads to layers of management, institutional bloat, and non-experts with MBAs calling the shots, and ultimately to losing the opportunity to fund genius, as it becomes ever blurrier who really has the vision and capability and who is just good at filling out grant applications.


Fine article describing the weak points of GraphQL. I find it a bit poor, though, that the only recommended alternative is OpenAPI REST APIs.

I have no beef with doing REST, JSON-RPC, etc. Actually, I consistently steer people that way. But the documentation format we chose as an industry to build these things with, Swagger, is just disappointing. Sometimes I think the industry would be at a totally different point had we gone with more powerful standards like API Blueprint (or maybe RAML).

Case in point: I'm consulting for an org with roughly 1k engineers right now on improving their API ecosystem, and what we are seeing is that the more OpenAPI tooling they use, the worse the DX gets...


> I find it a bit poor though that the only recommended alternative is OpenAPI rest APIs.

What are your recommendations? gRPC?


My rant, as is evident after that sentence, is about the industry choosing to standardize on Swagger instead of converging on a more powerful/succinct system.


Someone above recommended https://connectrpc.com/ , which looks quite promising to me. Maybe I'll get time to play with it today.


> But the documentation format we chose as an industry to build these things with, Swagger

This right here is IMO the biggest advantage of a GraphQL system. What equivalent to GraphiQL is there for OpenAPI? With GraphQL my frontend devs can go shopping for data in one UI, with rich descriptions and strong typing and even try out the queries live.


You are summarizing the very reason for its existence and popularity: everything is about making the frontend dev’s job easier, at any cost. This is a common theme across the entire web-adjacent industry and has been responsible for plenty of misguided changes and choices.


They did address that point specifically, suggesting TypeSpec as a more concise analogue. A more concise DSL could be acceptable, and presumably converting between the two automatically isn't that hard.


11k for three renames? Doesn't this also trigger a significantly higher amount of maintenance work for aircraft technicians to update aircraft databases on thousands of birds?


Isn't this just the typical case of changing DB hosts? You set up replication to the new location and migrate apps over to it?


In theory (!), yes.

It's something you'd need to trial, though, perhaps even a few times.


From what the migration doc says, it will only support pg_dump migration, meaning it will require downtime, or at least write downtime: create a backup using pg_dump, restore the dump to the new database, and make sure there are no changes to the old DB before switching over.
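A minimal sketch of that dump-and-restore flow (hostnames, user, and database name are made up for illustration; writes to the old DB must be paused before the dump):

```shell
# Dump the old database in custom format (compressed, restorable in parallel)
pg_dump -h old-db.example.com -U app -Fc -f app.dump appdb

# Restore into the new provider, using a few parallel jobs
pg_restore -h new-db.example.com -U app -d appdb -j 4 app.dump

# Only after verifying the restore (row counts, sequences, etc.)
# do you point the application at the new host.
```

The write downtime is the whole window from pausing writes until the restore is verified, which is why this approach stops scaling as the database grows.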


The ability to replicate from a database service is an extremely important consideration when picking any DBaaS. If you cannot do a no-downtime migration out of a database provider, you are needlessly locking yourself into a vendor.


Ouch, that will make migrating anything over a few hundred gigs almost impossible. Our Postgres DB is about 2TB and pg_dump stopped being viable a long time ago.


> Our Postgres DB is about 2TB and pg_dump stopped being viable a long time ago.

Due to the time it takes to export, or because it can't handle a database of that size at all?


I'd assume they mean time to export; while 2TB is a substantial database, it's not pushing Postgres limits.


You can use pg_basebackup to kickstart a replica. You can then just shut down the old host and use the new one with ~1 sec of downtime. If you need to upgrade Postgres at the same time, you can then cascade to one more replica on the same "new" host for safety, and run pg_upgrade on that one. This will result in a bit more downtime, but also upgrades the Postgres version.


How does this work? Is it a book every two days? Or three books on the weekend? How long is your commute and how do you keep off YouTube/Netflix in the evening?


When you get in the habit of reading, it's remarkable how quickly you can do it. It's quite easy to read a book in a day when you read every day.


Unfortunately, I got in the habit of reading before bed. I inadvertently trained myself that reading leads to sleep. Now when I read, I very quickly start to feel sleepy.

So my advice would be to get in the habit of reading, but be mindful of the details of the habit you are establishing.


If you read short books of low complexity, aye. We need to start talking page count. Ain't nobody reading 3 Robin Hobb books a week all year long.


I disagree. Reading light fiction (not literature or study books), I read 15 books each weekend in my teens, working through all the science fiction and fantasy books in the town's library. And no, not all were 1000 pages long, but they weren't all shorter than 300 either. And yes, I regularly went back during the week to catch an extra book or two.

I realize I might be a relatively fast reader (and I only read this fast in English, while my mother tongue is Dutch), but do not underestimate how quickly some people can read.

I suppose I can concede your 'low complexity' argument: I did not read Tolkien in my teens, and I probably would have skipped all the poetry while reading.

Edit: Clarification: 15 books each weekend should be read as ==>> I traded my 15 books for 15 new books each weekend, reading them during the weekend and when not in school


> 15 books each weekend should be read as ==>> I traded my 15 books for 15 new books each weekend, reading them during the weekend and when not in school

So you mean 15 books each week, not each weekend! I was surprised when I read that, but this makes more sense.


So you read about 780 books/year as a teen? I have doubts. Unless the average page count is under 100 maybe.


Again, this isn't a measuring contest, but I will point out that you could read that much if you wanted to. Life is tradeoffs, after all.


I commute for 20 hours a week on public transport, so a lot gets done there. I also read every night as a way to wind down before bed. Sometimes I will read continuously on a weekend if the book/series is engaging enough and can finish a couple of thousand pages that way.

Honestly, it's not a target; it is just what happens when you read a lot: you get through books.

As for YouTube or Netflix, I don't really watch that much and can't really understand how people spend so much time on them. Given that, I read instead, and lots of people don't understand that. Each to their own, happiness is different for everyone, etc.


> how do you keep off YouTube/Netflix in the evening?

Not having Netflix is a start. Realizing that most Netflix shows get dumped mid-story after 1 or 2 seasons and will leave me without closure is another.


But is the Internet really as centralized as email has become? I was living under the impression that Internet backbone providers were more diversified than that.


With BGP, you are usually only peering with a couple of entities unless you are yourself in the "big boys club". It's a relationship business, but you only need a relationship with the handful of engineers on the other side of your own links.

