
Me too. Hopefully someone working for the company sees this and has some sway to remove the requirement.


It's not a premature optimisation to use a hashset instead of a list though!
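
A minimal C++ sketch of the difference (my own illustration, not from the original comment; the function names are made up): a membership check against a list scans every element, while a hash set does a constant-time lookup on average.

    #include <string>
    #include <unordered_set>
    #include <vector>

    // Hypothetical example: checking whether an id was already seen.
    // A std::vector scan is O(N) per lookup, so doing it once per element
    // makes the whole pass O(N^2); std::unordered_set averages O(1).
    bool seen_in_list(const std::vector<std::string>& seen, const std::string& id) {
        for (const auto& s : seen)        // walks the whole list in the worst case
            if (s == id) return true;
        return false;
    }

    bool seen_in_set(const std::unordered_set<std::string>& seen, const std::string& id) {
        return seen.count(id) != 0;       // hash lookup, constant time on average
    }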


The bug is more devious than that. The code looks linear at a glance, but the culprit is that sscanf is itself O(N) in the length of the string, so calling it in a loop over a big buffer makes the whole thing quadratic. How many people would expect that?
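
To make that concrete, here is a hedged C++ sketch (my own illustration, with made-up function names): on many C runtimes each sscanf call effectively does a strlen of its input, so a parsing loop that walks a large buffer turns quadratic, while strtol only touches the characters it parses and stays linear.

    #include <cstdio>
    #include <cstdlib>
    #include <string>

    // Illustrative only: sum a long comma-separated buffer of integers.
    // On many C runtimes every sscanf call scans the *entire* remaining
    // string (effectively a strlen), so parsing N numbers from a buffer of
    // length L costs O(N * L) even though the loop itself looks linear.
    long sum_with_sscanf(const std::string& buf) {
        long sum = 0;
        const char* p = buf.c_str();
        int value = 0, consumed = 0;
        while (std::sscanf(p, "%d%n", &value, &consumed) == 1) {
            sum += value;
            p += consumed;                // advance past the number just read
            if (*p == ',') ++p;           // skip the separator
        }
        return sum;
    }

    // The usual fix: strtol advances a pointer itself and only touches the
    // characters it actually parses, so the loop stays genuinely linear.
    long sum_with_strtol(const std::string& buf) {
        long sum = 0;
        const char* p = buf.c_str();
        while (*p) {
            char* end = nullptr;
            long v = std::strtol(p, &end, 10);
            if (end == p) break;          // no more digits
            sum += v;
            p = (*end == ',') ? end + 1 : end;
        }
        return sum;
    }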


I also sent a physical letter to LucasArts when I got stuck as a kid. They kindly sent me back a full walkthrough! I wonder how many other kids must've done this...


I mailed Nintendo about a Dragon Quest puzzle and they wrote back. Was great.


Why doesn't Uber just let Californian drivers set their own rates? Would it really be such a huge deal to their business model if drivers could set their own price per mile, or similar? Uber still takes a percentage cut and I assume the free market cost would work out to be in the same ballpark as Uber's pricing model.


No, but they might drive a couple of hours a day for some extra pocket money on top of their UBI.


Would the contractor be an employee of the contracting agency in this scenario?


Countries are competing in a market for tax revenue too. If a country wants more tax revenue from a business, they can either close the legal loopholes that allow the business to reduce the amount of tax that they pay or make their tax laws more competitive with those of other countries.


OMG, this is golden. It's a race to the bottom. It ends up completely destroying states and replacing them with corporations.


I like your comment; not sure if you are joking, though. A few decades ago, when I started reading William Gibson’s cyberpunk sci-fi, I often wondered whether the corporate-run enclaves in his stories would anticipate real events.


Neal Stephenson - "Snow Crash". Even the U.S. Government is reduced to a "franchulate".


If the police are defunded, I'd bet that private security companies would fill the void and we'd be well on our way to "Stephensonian" being the next "Orwellian"!


It's possible


I think it's a bit more like going into a shop and trying all the doors, cupboards and drawers to see which ones are locked ;)


Isn't that the wrong analogy?

In this case, eBay is the shop and I'm the customer. It's like walking into eBay and, as I walk in, having to empty out all of my pockets and show them my phone screen to prove that no one is telling me what to shop for (VNC).


No. Because of client-side scripting with JavaScript, it's actually eBay's code running on your computer, acting as the customer toward the shop that is your computer. You're right that the end effect is similar to having to empty out your pockets, but the underlying issue of why they're able to do that is a whole 'nother can of worms.


That's a bad analogy. It's wrong because you can see which doors, cupboards and drawers are available to the public. Doors that are in reach but shouldn't be used by the public have signs like "restricted access" or "employees only". You can't do that with the internet. You can't see that a port is not available to you until you try it.

If you want to continue using that analogy, then you have to consider that everybody is blind and deaf, and checking to see what's locked is the only way to know if something is available.


> That's a bad analogy. It's wrong because you can see which doors, cupboards and drawers are available to the public. Doors that are in reach but shouldn't be used by the public have signs like "restricted access" or "employees only". You can't do that with the internet. You can't see that a port is not available to you until you try it.

But you can see what ports/doors are available. TCP ports are defined in the RFC and they are numbered 0-65535. Those are the ones available.

Port scanning is still analogous to trying all of these doors and seeing which ones are open (see the sketch after this comment).

Just because there are a lot of doors to choose from doesn't make it very different. That's why guests ask a host where the bathroom is.

When you visit a website, it's not very cool for that site to check which of all your TCP ports are open. It's none of their business.
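
For concreteness, here is a minimal POSIX-sockets sketch of what "trying the doors" means (my own illustration, assuming a Unix-like system; this is not anything eBay actually runs): attempt a TCP connect to each port of interest and note which ones accept.

    #include <arpa/inet.h>
    #include <cstdint>
    #include <cstdio>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    // Illustrative connect() scan against localhost: a port counts as "open"
    // if a TCP connection is accepted - i.e. trying a handle and finding the
    // door unlocked. Real scanners add timeouts and non-blocking sockets.
    bool port_is_open(const char* host, uint16_t port) {
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd < 0) return false;
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(port);
        inet_pton(AF_INET, host, &addr.sin_addr);
        bool is_open = connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) == 0;
        close(fd);
        return is_open;
    }

    int main() {
        // 5900 is the usual VNC port; 27036 is the Steam port mentioned below.
        for (int port : {22, 80, 5900, 27036})
            std::printf("port %d: %s\n", port,
                        port_is_open("127.0.0.1", static_cast<uint16_t>(port)) ? "open" : "closed");
    }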


Hmm, then how about going to the changing room area and trying every door instead of waiting for the guy to tell you which one to go to?


I made this edit to the post you replied to. You probably missed it:

> If you want to continue using that analogy, then you have to consider that everybody is blind and deaf, and checking to see what's locked is the only way to know if something is available.

About this:

> instead of waiting for the guy to tell you which one to go to?

How does that translate to TCP/IP? What is "the guy" representing? The way I see it, there is no guy.


The guy is you installing Steam to run on port 27036.


I was really blown away by the results you achieved. Amazing work! My jaw hit the floor when I saw the witty farewell "fun guy" quip, and I was in stitches when I read the song about baking. I look forward to the day I can take the model for a spin - unfortunately I don't have the requisite $18,000 hardware ;)

I have a few questions: Could this be used as a tool to get a feel for public sentiment? For example, could you ask the bot what it thinks about gun control and have it spit out a policy that appeals to the common public? If you ask the bot what it thinks about how a company will perform, how accurately does it predict? I know that the model will contain the biases of the data set, but I'm curious if you've run these types of experiments. What do you think the results would be if you had an even bigger, more diverse corpus? (devil's advocate, for the sake of discussion: perhaps everyone's fb messenger and WhatsApp chat history)

Finally, you have clearly gone to great lengths to make the bot pleasant to interact with. What sort of results do you get when you train such a huge model on an uncurated corpus and don't try to tweak its personality? I find myself wishing that you hadn't done this, as the bot seems to be hyper-agreeable, i.e. too many responses like "You like watching paint dry? That's super interesting! I love watching paint dry!".


I would not encourage using the model for anything other than AI research -- we're still in the early days of dialogue, and there are a lot of unexplored avenues. There are still nuances around safety, controlling generation, consistency, and knowledge involvement. For instance, the bot cannot remember what you said even a few turns ago, due to limitations in memory size.

In the paper, we did explore what happens when you do NOT fine-tune it on the specialized tasks (knowledge, empathy and personality). The non-fine-tuned bot was both less engaging and more toxic. The special fine-tuning is really important to getting this bot to be as high quality as it is.


But toxicity and quality are subjective. The technical achievement is undeniably brilliant, but the quality of the personality is a matter of opinion - as I mentioned, I did not personally enjoy the agreeability of the bot. What's toxic today may not be toxic tomorrow, and vice versa.

It's just a matter of time before a model of this size can be run on commodity hardware, and somebody will take the brakes off and/or attempt to run experiments that aren't just "can this thing pass the Turing test?". I'd be really interested to know the team's thoughts, given their expert knowledge and experience with the matter.


For a more toxic version of a similar kind of bot, check out SubSimulatorGPT2: https://www.reddit.com/r/SubSimulatorGPT2/top/?sort=top&t=al...

Unfortunately you can't talk to it. (I've wanted to retrain a version that you can interact with dynamically, someday.)


The comment thread on the self-awareness post is both very convincing and really meta. I love how clear the training of the different bots is.


Was the bot nonsensical without the fine tuning, or just subjectively a worse conversational partner?


I think the GPU price you’re quoting is about $10k too high; it's $8,500 for the 32 GB V100.


> The 9.4B parameter model requires at least two 32gb V100 GPUs to interact with

plus motherboard, CPU, RAM, etc.

