You make it sound like Google could just flip a switch and the assistant would magically work better. If it were that simple, why would they intentionally limit the platform?
So what is novel or unique here? It’s a half-finished demo/game of derived assets. This is like the stuff I made as a beginner.
I thought based on the name that it was some clever hack using only HTML and CSS (with maybe just a little cheating for input). But it looks like just a regular old js demo.
Edit: A search of GitHub will show hundreds of similar projects, most of them for undergrad assignments. Maybe I am a snobbish dick, but I expected something more notable than run-of-the-mill undergrad projects when coming here.
It also looks to me like something made by a beginner. But from that impression I have the opposite take: why write this comment at all? Your comment will only serve to discourage this person, who is just getting started and isn't sure whether they should continue down this path.
Does this project belong on HN? Probably not. Does your comment contribute anything? Definitely not.
> Your comment will only serve to discourage this person, who is just getting started and isn't sure whether they should continue down this path.
I don't know about that. If nobody had told me that my kid projects sucked, I may not have tried to make them better; I would have stayed content with their sorry state and just doubled down on them without ever seeing what's wrong... and then I'd be making everyone around me cringe, like a 30-year-old spouting cringy RPG Maker games :-)
The original comment is only discouragement, not critique. There’s a huge difference! Some people work harder when simply discouraged; it sounds like you’re that type of person. Not everyone is like that.
How is writing this comment helpful to your pursuit of finding something novel or unique? Next time simply click the link above or below it, or better yet make something novel/unique yourself.
Some group of people is guaranteed to become brain surgeons... the demand is there, the training is there, so it is a near certainty. In fact, in the US at least, you can know fairly precisely how many brain surgeons there can be per year. The question of who fills those slots isn't so kind to all takers: it comes down to dedication, hard work, and luck (i.e. the ovarian lottery).
Also, the majority of those who “fail” at obtaining neurosurgery training, if they were already on the path to medical training, will end up falling back on any number of high-paying jobs in medicine or surgery.
This study found no one did terribly well at day trading. Random chance implies there will be some outlying outcomes, but that is a fundamentally different situation from having too many people competing for too few slots.
> Before WeWork, most people weren't aware that coworking spaces were an option
Is this true? I am skeptical. If it is true it speaks to the mindboggling ignorance of Silicon Valley decision makers.
I was living in a small (<500k) metro outside California in 2009, and there were already 2 branded “coworking” businesses (which were well marketed to the tiny tech community). I may be the exception, but I had not heard of WeWork until much later. Then again, I wasn’t in a talent/resource decision-making position at the time, so it wasn’t really my job to know those things. If it had been and I didn’t, I would have fired myself for gross incompetence.
Regus may be huge, but they're generally not listed in any of the coworking space directories I've been checking.
I know they're in my city, but the website's location page quotes daily prices based on a 24-month commitment.
Either they don't feel threatened by WeWork and independent coworking spaces, or they don't feel the need to adjust their terms and contract conditions... which I find curious.
This is bullshit. Why would any interview process filter out the factors that lead to screwing around, working on other things, or just burnout? They won’t and they can’t, as these are dynamic factors. Also, I’m curious why you think a whiteboard interview, of all things, is the deciding factor in even minimizing this behavior.
A plausible reason why these behaviors are less common at Google, Facebook, etc. is that they pay much closer to “efficiency wages”... if you’re working at a typical tech or corporate company you’re probably making shit pay, so why wouldn’t you study to get something better?
Google pays much better than average and already provides top resume signaling, so there’s just much less incentive to screw around.
Google's process is still not fantastic at filtering for general critical thinking.
Irony: the person referred to in the story may very well have ended up at Google.
I know this possibility first hand. I worked for a shit shop near a Google office. My own fucking manager did whiteboard practice and ended up at Google a few months later.
It uses an LCD: a low-res, super-twist, shit one with a CCFL backlight, which eats battery. So replacing it with a different active-matrix LCD with an LED backlight can improve both battery life and quality (double the resolution).
> python 2 to 3 (at least by 3.3 or so) was one of the easiest transitions I've ever done.
Good for you. Some of us had code bases of considerable size and complexity though.
The fact is that for working software on the Python platform, this upgrade represented work that had to be done, and for a legacy app that was still chugging along it offered little benefit. If you had already coded around the Python 2 limitations for Unicode, e.g., then Python 3 was not a help at that point for something already in service. It was just more cost for no benefit.
> If people spent half as much energy upgrading as complaining, this would have gotten done 5 years ago.
These aren’t exchangeable. Why do people on the internet think bitching (constructive or not) is some valuable currency? Oftentimes the people complaining had no means or position to do the work. Four promotions later, I sure as hell wasn’t going to participate in that 2-to-3 mess on a project produced years ago, but I could still opine on the situation. I also had little incentive to fund it.
Your analogies are bizarre. I have no idea what you’re trying to convey with Phillips vs. Torx, but I’m going to wager its explanatory power in this case is shit anyway. I can still demolish it: having actually worked in manufacturing, there were times we told a supplier (i.e. Python) to fuck off and piss up a rope, because what they were proposing was not compatible with our existing tooling and it would have been too costly to convert for little benefit to us.
I respect Apple's prowess in the consumer space, but there’s a reason you don’t see their products regularly put into industrial roles where the timeline is more than 5 years. Apple products are disposable; many applications in industry are expected to last. Python is a general-purpose programming language (or at least billed itself as such). Your comparison is poor.
I found some Unicode bugs in porting—and I get continued free security updates. Seems fair to me, and certainly not “no benefit”. I can keep on 2.7 as long as I like; nobody’s forcing me to port. Compare the situation with Java!
I like Python. But it loses in the upgrade comparison to Java. Existing Java code almost always just keeps working with new compilers and JVMs. And new JVMs tend to increase performance of your code for relatively little upgrade effort.
Isn’t that comparable to Java? Oracle gives you paid security updates on version 8. You can keep using the previous version without security updates, or migrate to OpenJDK 8, which has its own security group. More options; seems better to me.
>considering timelines in engineering are to support a version for 60 years.
In the vast minority of cases.
Not in most electrical engineering (except power plants), nor in computer engineering. 60 years ago was 1959. What software/computer project from that time is still running?
Even moving outside of the electrical domain, how many physical products outside of civil engineering are expected to last that long? I certainly can't expect support for my car for longer than 25 years.
If there's a difference it's not time scale necessarily, it's that CPAs are better at amortizing support/maintenance costs across decades. CPAs have amazingly detailed depreciation charts for non-software engineering lifecycles. If you build a civil engineering project, you figure out asset depreciation versus maintenance schedules, and you budget accordingly.
I've yet to see a CPA adequately depreciate software assets, anywhere. Maybe we should help them out by building better depreciation schedules. There seem to be a lot of CPAs who don't believe software depreciates over time, and maybe that's the biggest disconnect in labeling it "tech debt": accountants hear that term and think they can ignore it on a balance sheet they don't care about, but "tech depreciation" might actually scare them straight (until they find the tax advantages in calling it that).
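To make that concrete, here's roughly what I mean, as a sketch with made-up numbers: straight-line depreciation, the same kind of schedule CPAs already apply to physical assets, pointed at a software asset instead.

    # Illustrative sketch only: straight-line depreciation applied to a
    # software asset, with made-up figures. A real schedule would pick the
    # method and useful life to match the asset and the applicable tax rules.
    def straight_line_schedule(cost, salvage, useful_life_years):
        annual = (cost - salvage) / useful_life_years
        book_value = cost
        rows = []
        for year in range(1, useful_life_years + 1):
            book_value -= annual
            rows.append((year, annual, book_value))
        return rows

    # e.g. a $500k internal tool assumed to be worth nothing after 5 years:
    for year, expense, book_value in straight_line_schedule(500_000, 0, 5):
        print(f"year {year}: expense {expense:,.0f}, book value {book_value:,.0f}")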
In a previous job that was mostly in an engineering/manufacturing department, but with a lot of Perl/Python automation scripts, we had an internal conference. One of the keynote talks was when not to automate using SW. It went into the cost of maintaining SW over the long term - including the fact that authors leave, and people who understand their scripts require higher salaries. Most people who write these scripts are not hired for SW roles, so their replacements likely cannot debug/extend.
Classic cars can still get parts after 25 years, but a lot of the time it's custom machine-shop stuff. There is a market for it. There's also a market for restoring old computers; look at The 8-Bit Guy, LGR, the Living Computer Museum in Seattle, and all the other people who restore old hardware for fun and education.
But I get your point, those are special cases for preservation. For mainline things, especially with today's processes for continuous integration, dependency checking, and advanced build tooling, dependency rot is something that should be accounted for in all software project planning. If your dependencies are a few months out of date and you don't have the time to update them and re-run tests (people write tests, right?), things are just going to hurt more and more later.
IBM does quite a good job of making sure that COBOL is supported long term with all needed updates for many many years to come. Python 2 is not in that position.
You mean, IBM does quite a good job of making sure IBM COBOL is supported long term. They are maintaining their compiler, which is exactly what PSF is doing. They are maintaining their interpreter, which is Python 3.
The GP post missed the fundamental difference between keeping COBOL running and keeping Python 2 running. Python 2 was also PSF's interpreter. IBM handling a COBOL upgrade like PSF handling the 2-3 transition would be unacceptable.
These are two very different kinds of institutions!
People who need COBOL support from IBM are paying a lot of money. Giant piles of money can get you many kinds of help that people won't volunteer to do for free... among them, maintaining ancient software in amber.
If you need Python 2 support and you are willing (and able) to pay the kind of money that IBM's customers pay for COBOL support, you'll be OK. For a start, Red Hat (aka also IBM!) shipped Python 2 in RHEL 8, which means they'll be supporting it until 2029 at the earliest.
To circle back to the original point, no COBOL committee would break commonly running COBOL programs the way the Python 2-3 transition did. Using COBOL as an example alongside Python is just wrong. Maybe the break is justified, maybe it isn't, but some languages do a lot of work to make sure things continue to work.
Would it help if PSF exponentially increased support and maintenance costs for Python 2 into multimillion-dollar contracts and bundled over-margined hardware in with the deal, to make it more of an IBM-like transition?
Yep, someone has to pay one way or another, like Red Hat customers on 7, but to say Python has anything near the life cycle of COBOL is just disingenuous. Old COBOL still runs, but Python 2 programs will not. It really shows what an achievement languages like COBOL, RPG, and Fortran are in terms of longevity and migration.
Er, why not? It's not like there's some kill switch in Python 2 that will make it stop working after January 1st, 2020. If it works now, then it'll still work, you're just not guaranteed fixes anymore. At least, not for free. As stated in the article, paid support options exist from several vendors.
Right, but in engineering it's kind of expected to get support for a version for at least 60 years. Software engineering is just really weird in that it moves so fast and nobody seems to care about breaking things.
We sell welding systems to weld stainless steel, copper pipes, etc. We always give a warranty of 24 months, guarantee paid support for 10 years, and support older machines only if possible. I am not sure which industry you are talking about, but 60 years is the exception in my experience.
I think this is where paid support comes into the picture. Volunteers can keep improving Python, businesses that can't/won't upgrade can pay someone to "handle it", and consultants can make money. Everyone is happy.
You have a dependency rot problem. Are you missing unit tests? Having a ton of unit tests can reduce dependency rot and breakage, and overall makes engineering upgrades a lot easier to deal with. They don't catch everything of course, but they can catch a lot.
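Something along these lines is what I mean; a minimal sketch with a hypothetical example, where the dateutil behavior is just a stand-in for whatever your own code actually relies on:

    # Sketch: a test that pins the third-party behavior your code depends on,
    # so a routine dependency bump that changes it fails in CI instead of in
    # production. Hypothetical example; substitute your own dependencies.
    import unittest
    from dateutil import parser

    class ThirdPartyBehaviorTest(unittest.TestCase):
        def test_date_parsing_contract(self):
            # If a future dateutil upgrade parses this differently,
            # the upgrade fails loudly right here.
            parsed = parser.parse("2019-12-31")
            self.assertEqual((parsed.year, parsed.month, parsed.day), (2019, 12, 31))

    if __name__ == "__main__":
        unittest.main()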
If you haven't put in the priority to update your Py2 to Py3 apps by now, I really think your shop has the wrong priorities.
It's not just about Py2/3. Dependency rot is one of the worst forms of technical debt. It often shows up as broken CI, broken security scanning, and lots of generally broken processes that will just keep hurting a team further and further down the line.
I don't really agree with much of what the person you replied to said, but certain domains have had their hands tied and I also can't imagine them working as you describe.
CG/Visual Effects industry is still firmly using Python 2. Only in 2020 are they taking the first step to transition to Py3 [1]. Users and studios are held back because the Python runtime is used inside major applications; Nuke, Houdini, Maya as well as libraries and APIs. None of them have released a version that runs Python 3 yet.
The reasons for delaying it (mentioned in a footnote on that page) makes sense to me. Previous years were focusing on coordinating updates to GCC, Boost, C++14, Qt, and waiting on Python bindings for Qt.
Also, I've worked at a couple studios many people have probably heard of, and none of them have unit tests covering much of their code. The focus is on tools that facilitate in-house artists, where responsiveness to needs is valued over architecture and completeness. Requirements change for each project, and previous requirements are often sacrificed (until a new project needs them in a few years).
I'm itching to move to Python 3, but even for standalone tools I've felt it better to choose a completely different language (or Python 2) instead of trying to mix Python 2 and 3, because having them co-exist creates more headaches in managing environments, dependencies, and coding styles.
My overriding motto as an engineer is "If it ain't broke, don't fix it." My Python 2.7 component has run flawlessly for years now, with virtually no need to update it. I just haven't had to worry about it. We decided to update to Py3 a few weeks ago, and this component is now having sporadic hiccups. Like, it runs fine 95% of the day, then all of a sudden a timer callback in a Tornado ioloop just stops running for 30 seconds after working fine for 10 hours or something. This disables the entire trading system. It's hard to explain to anyone, myself included, what I gained by updating a perfectly working system to something that now shuts down production a few times a day. Now my time is diverted from doing actually useful tasks to fixing code that wasn't broken to begin with.
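For context, the structure is roughly this (a minimal sketch with made-up names, not the actual trading code):

    # Rough sketch of the setup described above; names are hypothetical.
    # The PeriodicCallback is the thing that intermittently stops firing
    # for ~30 seconds after the Py3 upgrade.
    from tornado.ioloop import IOLoop, PeriodicCallback

    def poll_market():
        # The real callback checks orders/market state; if it stalls,
        # the whole trading system stalls with it.
        print("tick")

    if __name__ == "__main__":
        PeriodicCallback(poll_market, 500).start()  # fire every 500 ms
        IOLoop.current().start()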
I didn’t think it was the most amazing comment either; however, it was intended more to illustrate how things are rather than how they ought to be. I think some interpreted it as a strong opinion in favor of the circumstances, which was not intended. Based on how it scored (somewhat surprisingly), it clearly resonated with more than a few.
You've managed to say their analogy is bad in two paragraphs without even explaining why it's a bad analogy. You even admit you don't really understand what they were trying to say. Poor explanation, or poor understanding?
If an analogy requires more explanation than the original concept what purpose does it serve?
I think in this case, at best, the analogy grossly oversimplifies the issue. If it really were just a “screwdrivers” problem as explained, then the 2-to-3 migration would have been mostly trouble-free and would have happened. Clearly it did not go that way, so that analogy cannot possibly be appropriate.
Frankly, there’s no saving the people concerned about quantization issues with digital vs. vinyl. They clearly don’t understand the basic physics: the mechanical properties of PVC severely limit dynamic range, such that with about 10 bits of signal and some noise you can quite comfortably and transparently recreate vinyl. 16 bits is more than enough to record vinyl.
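The back-of-the-envelope arithmetic, using rough commonly quoted figures rather than measurements of any particular pressing:

    # Each PCM bit buys about 20*log10(2) ~= 6.02 dB of dynamic range.
    # The vinyl figure below is an assumed round number, not a measurement.
    import math

    db_per_bit = 20 * math.log10(2)                    # ~6.02 dB per bit
    cd_range_db = 16 * db_per_bit                      # ~96 dB for 16-bit PCM
    vinyl_range_db = 65                                # good pressings land around 60-70 dB
    bits_to_cover_vinyl = vinyl_range_db / db_per_bit  # ~10-11 bits

    print(f"16-bit PCM: {cd_range_db:.0f} dB; vinyl needs about {bits_to_cover_vinyl:.0f} bits")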
There’s no reasoning somebody out of something they weren’t reasoned into the first place, so usually this isn’t an argument worth having.
A modern laser interferometer could easily get well past the resolution that is limited by the vinyl structure itself.
The CED offers one hour of VHS quality per side that rapidly degenerates. I own a sizeable collection of them, and while sorta neat, it’s also sorta terrible.
Vinyl offers quality superior to most consumer tape equipment, rivaled only by 11 ips reel-to-reel. CED had no quality advantages, decays rapidly, and is non-recordable. It was a non-starter.
The fact is that CED had been in the works since the '50s, when it might have had a short, successful life, but RCA couldn’t get off their ass. In fact, CED was the last consumer product ever released by RCA. (The RCA of today is just a badge on various crap after being divested by Thomson Consumer Electronics.)
Also, even though CED uses a stylus, the similarities end there. You can literally run your nail along a phono groove and hear the sound; it is a literal imprint of the sound wave. Video cannot be encoded in such a straightforward manner; NTSC and PAL are not trivial encodings. Also, the CED is not vibrating the stylus. Rather, the stylus sits on ridges and the signal is depth-encoded (thereby varying the capacitance, hence the name). It is much closer to a crappy LaserDisc in operation (which also encodes in analog... not digital).
An old boss of mine worked on that beast! From the sounds of it, it barely worked in the lab. He told a story of getting so frustrated one day that he literally shoved his ’scope off the back of the bench and walked out for the day. In his opinion, it is a good thing that it is dead and buried.
I have dismantled the players. Truly a marvel of electromechanical design. So many hand-soldered and hand-assembled parts. It is incredible they made it work in mass production. Too bad it was 20 years too late.