
They are talking about PhDs in Machine Learning, of course.


Lots of universities have Data Science Master's Programs, which may ultimately fall under the Computer Science, Mathematics or Stats department. So, it depends on the university.


Data Science degrees seem to be such a hodge-podge of topics, with very uneven quality between programs. For long-term benefit, I'll admit I'm more inclined to tackle a traditional MS like CS, math, or stats.


I am inclined to agree with you, but then I remember that people used to say the same thing about chess. Perhaps completely solving language requires strong AI, but maybe we can get 99% of the way there with something like the "Chinese room": an AI that works like a well-trained parrot.


Being paid less for a more fulfilling/meaningful job IS fair.


Fair point. The market is working properly.

The problem is mostly academics selling lies about that market to impressionable young students. They're supposed to be mentors looking out for their students' best interests, but are actually just pushing up the labor supply and pushing down prices.

forgotpwtomain's comment is highly indicative of the way academics do that. "Ignore the reality of terrible job options in academia. Industry is dirty and being a penniless researcher is the only noble path through life."


I don't know about you, but I'm quite grateful that people like Donald Knuth stayed in the apparently terrible place that academia is rather than becoming senior managers at IBM for 500k+ a year.

I don't think that not having a huge salary == lower quality of life; having a non-rewarding job does, though.

> Industry is dirty and being a penniless researcher is the only noble path through life.

I never advocated this. While for some people being penniless doesn't significantly impact quality of life, for a lot of talented people who want to have families it does in fact matter, and it's a large loss for science if academia cannot retain these people.


So am I. In fact, I wish more people were able to work on research. Treating a desire for a comfortable living as the problem rather than an objective to be fulfilled is what keeps us from getting more and better researchers.

Also keep in mind that his generation's options were much better. The academic job market was a lot friendlier back then.


If that were true, the most meaningless and soul destroying jobs would be paid the most.

They aren't.


It is a factor, but not the only factor. Besides, the world isn't completely fair, you know.


This is slower than the sieve, right?


I think so, but the memory usage is drastically lower: Only O(x/ln(x)) vs. O(x), for getting all primes up to x.


While there are only about x/ln(x) primes, each one takes at least log(x) bits to represent, so there's no memory savings after all.

In fact, there's an increasing loss as we optimize the constant factor in the sieve's O(x).

To wit, Dijkstra's algorithm takes 776MB to store all 203280221 primes under 2^32 at 4 bytes per prime.

A simple version of Eratosthenes' sieve, using 1 bit per odd number, takes 2^31 bits, which is only 256MB.

A more streamlined sieve like the one on my webpage at

http://tromp.github.io/pearls.html

implicitly filters out multiples of 2, 3, and 5, leaving only 8 potential primes in every 30 consecutive integers, conveniently fitting in a byte, for a total of 137MB.

(for some reason, compiling with nonzero optimization gives a gcc 4.8.5 internal compiler error on my SUSE Linux box)
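For concreteness, a minimal sketch of the simple odds-only sieve mentioned above might look like this in Python (the name and the 10**6 example are illustrative; this is not the mod-30 wheel code from the linked page):

    # Sketch: one bit per odd number below x, so roughly x/2 bits
    # (256MB for x = 2^32), versus ~4 bytes per explicitly stored prime.
    def odd_sieve(x):
        """Bit i of the result says whether the odd number 2*i + 1 is prime."""
        nbits = (x + 1) // 2
        bits = bytearray([0xff]) * ((nbits + 7) // 8)
        def clear(i):
            bits[i >> 3] &= ~(1 << (i & 7))
        for j in range(nbits, 8 * len(bits)):
            clear(j)                                   # clear padding bits in the last byte
        clear(0)                                       # 1 is not prime
        i = 1
        while (2 * i + 1) ** 2 <= x:
            if bits[i >> 3] & (1 << (i & 7)):
                p = 2 * i + 1
                for j in range(p * p // 2, nbits, p):  # odd multiples of p, from p*p
                    clear(j)
            i += 1
        return bits

    # Counting the set bits (plus 1 for the prime 2) gives 78498 primes below 10**6.
    print(1 + sum(bin(b).count('1') for b in odd_sieve(10**6)))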


(edit: typo gone, thanks for the interesting write-up)


Indeed! Thanks a lot. Its corrected now.

... no wait: It's corrected now. ;)


If I understood correctly, they are not using any MCTS; I believe they are not doing any look-ahead, only greedily selecting the best turn.


Correct - we are not using MCTS and the approach has nothing to do with AlphaGo.


THE ROAD TO WISDOM

The road to wisdom? -- Well, it's plain
and simple to express:
Err
and err
and err again
but less
and less
and less.

~Piet Hein


Wow, as someone who is going to start doing Computer Science next year, I had already heard that most of programming is about thinking, but had no idea that it was to the point of 5 lines of code per day being exceptional.


Green-field software development should be well in excess of 5 lines a day. If the software is old to the point that nobody fully understands it any more, and previous maintainers have left the company, the number of production lines of code added per day may easily be lower than 5.

Older code has a lot more functionality, so every line of code is capable of leveraging more. 5 lines of code in a mature system may do more than 500 lines in a new system. But a bigger factor is figuring out which 5 lines to write. A third of the time can easily be spent researching the code trying to figure out a plan of attack, and the remainder iteratively debugging different variations of 5 lines validating (or invalidating) assumptions made about the code's emergent design during the planning phase.

And I'm not counting test code. Test code is usually cheap to write, if the code is testable. Writing lots of tests is an easy way to inflate lines written, should you be stuck somewhere in the dark ages where they actually measure productivity by such a discredited yardstick.


Don't wait until next year. Start programming now. School is good to learn computer science; school is not needed to learn how to program.

Don't read anything into the 5 lines of code per day. This is a pretty old article and even then, that would only be a reasonable figure if you take a large number of people on a large project and compute an average for everyone. In other words, counting refactoring, fixing bugs, re-writing things, doing administrative work, and so on towards the time elapsed. And even then it would vary so much from one project to the next that you couldn't get a reasonable representative number, though 5 LOC per day would certainly be possible.


I have already started programming; it's just that all the problem sets I have solved were easy enough that I could get to coding right away, and when I couldn't, I just thought I was dumb.


Don't ever let yourself think you are dumb. It's a lot more harmful to your psychology and to learning than it may at first appear.

If you can't figure something out, you are just lacking some knowledge you need. I wish I had learned that a long time ago.

Being dumb is never the explanation for not understanding or figuring out something.


In my experience, as an outrageously well-paid programmer, I will touch a lot more than 5 lines a day whenever I am programming, especially if I include changes to the unit tests, integration tests, and such. There are two caveats, though:

First, you will not be coding every day. There's requirements gathering, setting up environments, build systems and such. Depending on what you are working on, this could be over 50% of your time.

And second, and perhaps more importantly, most of the lines you touch will replace lines that you, or someone else, wrote in the past, instead of just adding more functionality without touching the existing codebase. For instance, 5 months ago I started working on a project that had been written by two junior programmers who did not understand the language they were using, functional programming, or the problem they were solving. The code was full of repetition and bugs, and there was no way to make heads or tails of what it was actually solving. So I started refactoring, working on eliminating duplication, and trying to build abstractions. After a week, I had made a good dozen commits every day, but they were on top of each other: total LOC actually went down. Only after that week of coding that didn't add to the codebase could I see the places where refactoring just wouldn't do anymore, and figure out what to rewrite and how to add the additional requirements.

So I had spent a month on a 15K LOC codebase, and ended up with 12K LOC that did more, had more tests, ran faster, and made sense to the people that would end up owning the product in the long term. But yes, I mucked with over 100 lines of code most days.

Situations like that happen to experienced programmers all the time.


When working on an existing system, I find it's a good sign if your SLOC/day count goes negative for a while.


On one contract I had, I dropped the line count of an inherited codebase by 50%. Much of that was necessary simply to be able to run (and debug) it interactively.

It still ran. It ran correctly. It ran faster than previously.

I never quite grasped what it was doing.

The company folded some time after I left it.


That is wisdom...


Don't be discouraged; it's par for the course.

We're talking about quality lines of code: debugged, QA'd, and working as expected, perhaps even after the customer has had a chance to peruse the beta. You will be surprised how much time you'll spend in meetings and how hard navigating and contributing to large projects becomes.

Curiously, the number of lines of quality code expected of a developer has not changed much since "The Mythical Man-Month" was written, which was about IBM machines in the 60s, despite the obvious advances in programming languages and programming environments. It's still around 10 or so.


For me, lines of code per day varies depending on what is being done and how.

If it's a meaty problem that is not well understood, or a language or framework I'm unfamiliar with, or I sense the situation is complex enough that an early bad decision could lead to unforeseen troubles later (and I should tread carefully), very few or no lines are written during the day.

On the other hand, if it is similar to a thing I have done many times before, and I'm comfortable with the problem, the language, and the framework, then there is much less time spent thinking and StackOverflowing and I can go at nearly typing speed. (Likewise, if the code is quick "one-off"-type stuff for tests or throwaway work.) More than several hundred lines of code a day (JS or Python) in cases like these is very achievable.

This doesn't count things like HTML templates, CSS, etc. (although sometimes those take a bit of thought as well).


Remember that you will spend more time debugging, documenting and testing your code. And this is if you work somewhere that doesn't expect you to spend 10+ hours a week in meetings.

[edit] And you will also write a lot of code you throw away; sometimes you need to solve 90% of the problem the wrong way to discover the right way.


No, this is just an extreme example of the "don't try to be a smart guy" ideology. In practice 100-500 LOC per day is normal.


> 100-500 LOC per day is normal.

It is not. You are not writing a novel here. Yes, most of the time is spent thinking. Some days there is no coding at all, because the day is spent just trying to figure out what to do.


While emotions may run high, it's important to remember that this is an empirical dispute. It can be resolved simply by looking at everyone's git history. For myself, the least code I've written on any day in the past year is 20 lines. My average is around 100. The most is a tad over 1500. Of course everyone's different and LoC is a terrible measure and it depends on the language, task at hand, etc. But in general, most of my time is spent "honing" code; testing/fixing corner cases and working around issues in other software. "Other software" includes browsers, filesystems, databases, JITs, but mostly browsers.

I personally don't spend much time thinking about software-related problems. At the risk of sounding conceited, I'll admit that most problems I encounter are pretty straightforward. Five minutes of uninterrupted thought is more than enough to get into diminishing returns.

It would be very interesting if GitHub pulled an OKCupid and published some statistical analysis of programmer behavior. They could put many of these disagreements to rest.
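To make "looking at git history" concrete, something along these lines gives a rough lines-added-per-day figure (a sketch; the author filter and the one-year window are arbitrary placeholders):

    # Rough "lines added per day" from a repo's git history (run inside the repo).
    import subprocess

    log = subprocess.run(
        ["git", "log", "--author=you@example.com", "--since=1 year ago",
         "--numstat", "--pretty=tformat:"],
        capture_output=True, text=True, check=True).stdout

    added = sum(int(fields[0])
                for fields in (line.split("\t") for line in log.splitlines() if line)
                if fields[0].isdigit())   # binary files report "-"; skip them
    print("~%d lines added per day over the last year" % (added / 365))

Of course this only counts lines added per commit, not lines churned while iterating, so it understates "lines touched".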


I'm curious to hear about how many problems you solve with these lines of code. I guess that's pretty much impossible to quantify though. Also, what languages do you work in?

I'm not that experienced, but I had a notable experience working with a 5000+ loc app that was a nightmare to maintain and extend. The last guy had basically reinvented every wheel. He was even, in my opinion, reimplementing the DOM in places with absolute positioning. Also, the code was poorly organized, with pieces of logic that could have been easily consolidated appearing throughout the app in multiple places so that they could not be abstracted out.

After about a month of trudging through his code and making almost no progress, I basically told my manager that we had to rewrite it (I had been thinking this the whole time, but I didn't want to be the junior dev who comes in and demands to scrap everything).

A colleague and I paired on it for about 2 weeks and rewrote the whole thing entirely from scratch, leveraging several open source libraries and writing a few tested "internal libraries". The whole thing ended up being around 800 LOC when we were done, and it had the extra features that needed to be added, and was pretty bug-free.

I'm not trying to blow my own horn here; in other instances I have spent way too long pondering code and making it way more concise than necessary.

But unless you are a truly great coder who is cranking out 100 lines of good code per day (and I have no doubt that you are), I would be pretty suspicious about the amount that you are writing. I would worry that you are placing a great maintenance load on those who come after you.


(Note: I don't mean for this comment to be perceived as bragging or showmanship. It's just that... well, you asked about my background.)

Computers have fascinated me since before I can remember. I spend most of my waking hours in front of them. I've been honing my craft for over 20 years. I've written and maintained projects in C, C++, C#, Java, JavaScript (both browser and server-side)[1], Perl, PHP (we all make mistakes), Python, Ruby, and a couple Lisps. In my travels, I've discovered and reported bugs in popular software such as Firefox, Chrome, Node.js, Apple's XNU kernel, and libxml2.

I completely agree that some programmers are like machine guns, firing off vast quantities of poorly-aimed code. I try my best to avoid that. I hate sloppy code. I hate repetitive code. Most of all, I hate re-inventing the wheel. If a decent library exists, I'll use it. I have no qualms with something Not Invented Here.

I pair sometimes. I do code reviews often. And I use as many profiling, testing, and static analysis tools as I can get my hands on.

It sounds like your ordeal made you a better coder. Those sorts of experiences are indispensable, but I've found it takes more to keep improving. It's very useful to become an expert on programming, not just an expert at programming. There's a growing body of literature to aid anyone interested. McConnell's Code Complete is still great. Michael Feathers has a book called Working Effectively with Legacy Code. It contains some great techniques for incrementally improving hard-to-maintain projects. Lastly, browsing It Will Never Work in Theory[2] is a good way to stumble into some academic papers that apply to your own work.

1. "JavaScript" is such a nebulous term these days, but I've worked on JS codebases using tools ranging from nothing (vanilla JS) to JQuery to Google Closure to React.

2. http://neverworkintheory.org/


These metrics such as featured in Mythical Man Month are usually based on the SLOC count at the end of the project, divided by the time, not how much code was committed each day. Your churn might be 100+ LOC per day, but hopefully that isn't all permanent additions to the codebase -- bug count seems to be proportional to LOC.


Firstly, any real working programmer has days where they write no code. You are writing documentation, meeting with users and sponsors to discuss new features and schedules, doing code-reviews and mentoring, merging branches, debugging race conditions, meeting vendors, chasing dependencies in other parts of the company, a million other things. That brings the average LOC/day down.

I can't imagine anyone sustaining 100 LOC/day over the long term unless they think that cranking out HTML templates or something counts as programming.


I realize that I was referring to the number of LOC touched per day, not the change to the total codebase size. I'm not sure what the first poster had in mind.

For reference, I count myself as writing about ~40 LOC a day for a greenfield project (which I spend about 50% of my time on), measured by total codebase size.

And no, it is not HTML templates, although it is a fairly verbose language. And I do all the things you describe in your first paragraph.

In my experience, a good programmer is almost always a fast programmer, and non-trivial programs usually require a lot of LOC to get the job done.


I have to disagree here. This is definitely possible, especially on more greenfield projects. Most of the engineers on my team produced that much sustained. And my average is closer to 400/day even though a big chunk of my time is spent doing things other than programming (e.g. tech leadership / management). LOC isn't a great metric though and can be easily gamed.


It depends a lot on what kind of code you're writing. Don't expect to write 500 LOC/day if your project is to write a device driver or an improved computer vision algorithm.


>>an improved computer vision algorithm

I do this for remote sensing, and I might write 50 LOC on the main thing, and often a bunch of Python to test if the idea works.


I am okay with the general concept (though I guess it would be a lot better to put a computer in a keyboard, or, seriously, in a phone), but I think that claiming this is the future of computing is ridiculous. Pretty much the only thing this seems to be good at is letting you procrastinate at work, and once people are aware that you can put a computer in a mouse, that goes away.


Everyone with a Kickstarter thinks he's Steve Fucking Jobs. When Steve said things like "magical" and "revolutionary" in every presentation, it was annoying as hell, and he came a lot closer to actually revolutionizing the world than Mouse-Box ever will.


Yeah, why can't I plug my Android phone into a large monitor and use it like a trackpad?


The Motorola Photon 4G had an entire Ubuntu-based GNU/Linux distro packed into the ROM image, and they sold a docking station for it that basically turned the phone into a full-fledged computer when connected to an HDMI-capable monitor. Its cousin, the Atrix, did the same thing using a laptop-style dock.

A company called Cybernet has been putting full computers in keyboards since at least the early 90s (think Commodore 64 but with x86 guts).

In other words, this Mouse-Box is evolutionary, not revolutionary. I like it, but I'm not blown away.


I owned one of those and I never knew that.


If you have something like an Nvidia Shield tablet, you can basically do this. Bonus, it has a stylus so you can use it as a drawing tablet.

There's also things like this https://play.google.com/store/apps/details?id=com.thingsstuf...


I am going to break the OP's rule: does anybody here have Linux running on a MacBook Air? I really wanted to get a light laptop with long battery life and found no alternative as good as the MBA.


I do, running Ubuntu 14.04 on a 2013 model. It mostly works.

There was an issue with it waking up when the lid was closed that I fixed with a simple service that just runs "echo XHC1 > /proc/acpi/wakeup" at startup.
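In case it's useful to anyone: on 14.04 (which uses Upstart) one minimal way to run that at startup is a job file roughly like this (the file name is illustrative, and this may not be exactly the service described above):

    # /etc/init/xhc1-wakeup-fix.conf -- illustrative name
    description "stop XHC1 from waking the MacBook Air when the lid is closed"
    start on runlevel [2345]
    task
    exec /bin/sh -c 'echo XHC1 > /proc/acpi/wakeup'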

Also, the webcam isn't a USB webcam: it's a PCI device made by Broadcom that doesn't have any drivers. And the 128GB SSD is a bit small and 4GB RAM isn't enough, as I use VMs sometimes.

On the plus side, the hardware is really nice and the trackpad is great. Still, I might have to get something a little more high-powered for myself and give this to my wife :)


It's not an Air experience, but in the OP's rule-breaking department I can report happily that I've been running Linux on a MBPR 15"/2014, and it's a very nice setup indeed .. the only issue so far has been the trackpad button-taps not quite working right for my hands, but okay .. an external mouse works. To be honest, I'm actually also able to mount my Linux partition in VMware and use OS X as the host environment; in such a configuration it's also a very viable Linux development rig. If I ever need to do kernel work, though, I praise the holy rEFInd lords .. don't quite know where I'd be without that:

http://www.rodsbooks.com/refind/

I'd love for a Linux-lovin' hardware vendor to pry my MacBook away from my Apple-ate'd brain and give me a hardware platform that is 100% open source, and yet .. sexy as all hell. Alas, the sexy part is Apple's plaything, it seems. I literally do not understand why nobody else is making hardware in the same league, design-wise (I know there are technologically far superior systems; it's all about the haptic experience here, ok?) ..

