Hacker News new | past | comments | ask | show | jobs | submit | kgen's comments login

To be honest, it's probably not enough to just block these scrapers if they're acting maliciously. People should start serving generated content back to them and see how long it takes for them to catch on and fix the problem.
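A minimal sketch of that idea, assuming a simple substring check on the user agent (the bot names, the `decoy_page` helper, and the seeding scheme are all illustrative, not any real service's behavior):

```python
import random
import string

# Example bot names to flag -- not an exhaustive or authoritative list.
SCRAPER_AGENTS = {"GPTBot", "CCBot"}

def looks_like_scraper(user_agent: str) -> bool:
    """Crude check: does the user agent mention a known scraper bot?"""
    return any(bot in user_agent for bot in SCRAPER_AGENTS)

def decoy_page(seed: int, paragraphs: int = 3) -> str:
    """Deterministically generate plausible-looking filler text."""
    rng = random.Random(seed)
    words = [
        "".join(rng.choices(string.ascii_lowercase, k=rng.randint(3, 9)))
        for _ in range(40)
    ]
    return "\n\n".join(
        " ".join(rng.choice(words) for _ in range(30)).capitalize() + "."
        for _ in range(paragraphs)
    )

def handle_request(user_agent: str, real_content: str) -> str:
    """Serve generated junk to flagged scrapers, real content to everyone else."""
    if looks_like_scraper(user_agent):
        # Seed from the UA so the same scraper gets stable (cacheable) decoys.
        return decoy_page(seed=sum(map(ord, user_agent)) % 10_000)
    return real_content
```

The point of seeding from the user agent is that repeated requests from the same scraper get consistent pages, which makes the poisoning harder to detect by simple diffing.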


I was just about to post the same thing -- quite a fascinating test of GPT's capabilities.


But lots of airlines have come and gone or merged over the years, so these systems would already have had to deal with new airline codes?


Probably true. But their own systems internally almost certainly have those magic letters hard coded in thousands of places.


100% facts. They do.

In prior roles, I've worked at 2 of the largest US airlines that have gone through mergers. There are lots of hard-coded letters in decades-old code that help identify what's mainline, regional, and OA (other airline). We're talking everything from reservations (booking) to revenue recognition (flight departure).

It would be no simple feat to upgrade the tech stack. Hell, some of the mergers are still lingering within the systems because the airlines wanted to complete them quickly for the passengers. The backend, however, has band-aids all over the place.


I was thinking about that very thing as well, but I came to the conclusion that robotic movement doesn't really need to match human movement. If you want consistent fine motor control, you wouldn't really want things like human-style acceleration when swinging a hammer, or when pulling down the top of the griddle.


I don't think it's intentional but I'm definitely giving him a pass since even AAA movies get this stuff wrong.


There's PBS, but yeah, if it had the popularity of the BBC it would probably look a lot different.


Right, and what the person above is saying is that spyware/bad extensions can be making random searches which muddy your profile's search data.


It is? That's not how I would normally interpret "interfering".

And why should random searches make my results significantly worse than the default? That doesn't exactly absolve Google of screwing up.


I imagine the GPU also draws less power at idle, so both would scale at unknown rates when benchmarking.


I've definitely bought my fair share of non-framed art on Etsy, so that is one option.


This is so interesting. I wonder how you would debug something like this (assuming there is no copy of the original, pre-processed image as well)?


This is a good question, and one that I've been thinking a lot about as well. I think the problem is that

A) there's a LOT more people using computers now than ever before, and

B) the entry point into "computing" has lowered, which means that users come to a piece of software with a wide range of expectations of both what it should do, and how it should do it, and

C) the complexity matrix of the form factors that people expect to use software has also grown in size, and

D) software grows in complexity in response to the needs of the users (B) and the devices it runs on (C), and

E) tacked onto the above is that you really have to handle N+10 things nowadays to have a proper piece of software (i.e. i18n, security, UX, performance, etc.)

It's a bit of a cop-out to just blame complexity, but it's almost impossible for a small team to write a (sufficiently complicated) program that does all of the above. Yet companies want single-software solutions (runs on everything, for everyone from new users to veterans), and users expect the same. As a result, developers are overworked, users complain that the software doesn't do what they expect (often for good reasons), and devs often end up resenting users because, despite their best intentions, it's simply not possible (a lot of the time) to make something that satisfies everyone.

