HP Unveils Premium Chromebook: 3K Display, Intel Core M, 16 GB of RAM and USB-C (anandtech.com)
229 points by msh on May 3, 2016 | 229 comments



Every time I've had a 3200x1800 laptop, I've wanted to go back to 1920x1080 or 1920x1200. Why? Well, personally, it's not worth going beyond that: VRAM gets wasted, along with GPU and CPU cycles, and then software isn't perfect - some of it scales well, some of it won't... I bought my son a Windows 10 laptop with a 3200x1800 display, and lots of Qt applications wouldn't scale properly, nor would the Minecraft launcher. Maybe it's better on Linux or Chrome OS, and it's surely a solved problem on OS X, but I still don't see the point.

The big jump was moving from 1024x768 to 1920x1080. 4K displays are great - but on big screens, not on a laptop.


> Maybe it's better on Linux or Chrome OS, and it's surely a solved problem on OS X, but I still don't see the point.

Most of my use of a laptop is for programming, which means a lot of time staring at text. The far crisper text on a Retina/UltraHD/4K/whatever-marketing-term monitor feels much nicer to read for hours on end, for my money. It's a lot more like reading print.

In contrast, I feel like I'm squinting when reading jagged lower-res text on my 1920x1080 external monitor.

Disclaimer: I use OS X, so I don't run into the scaling downsides you see in other OSes, and those are significant and would probably turn me off of 4K displays.


Other operating systems tend to snap fonts to the pixel grid, whereas OS X does not. So OS X suffers more from low-res displays in general.


It's funny, though, now that Retina is a thing, how much superior OS X's font rendering algorithm is. This was not very clear when fuzzy low-res displays were the norm. Back then, I vastly preferred Microsoft's ClearType approach. Now, it's not even a contest -- Windows does not represent fonts truthfully.


It's pretty clear that OS X has existed for what, 15 years, most of that time without Retina screens. It didn't retroactively become better just because screen resolutions improved to make what was presumably an inferior algorithm in the prior context work better in the modern context.


But it did. OS X (and Mac OS before it) were optimized for fidelity with higher-resolution laser prints. This necessarily caused more anti-aliasing artifacts than with pixel-snapped text. It was actually less of a problem with CRT monitors, where individual pixels blended together better.

This is not a shallow concern; pixel snapping can cause major issues, like line breaks differing between print and display.


Interesting - I would never have considered print publishing. That said:

Was this configurable, and what percentage of users were concerned with producing print copy? In what eras?

I would bet that these individuals were always a minority, and the arrangement appears to have been inferior for the majority of users for the majority of the time it was a factor.


This was a Steve Jobs obsession from his first calligraphy class at Reed College. Proper proportional fonts were built in with the original Macintosh - this wasn't something he was going to do a market analysis on to decide for him.

However, historically this was less problematic because screens were fuzzier anyway. It was only with LCDs that the anti-aliasing became obvious.


The Macintosh may have had a very small market in the '90s, but one market that it did dominate was print.


Most programs are pretty horrible in 4k with win10 too.


Really? All of my dev tools support scaling.


I agree, I have had no problems (except for Eclipse, but I vowed to use Eclipse as little as possible well before I upgraded to win10+4k).

Intellij, Cygwin, Ubuntu VMs (DPI scaling worked well in vmware at least), VS Code, VS proper, sublime, idle, cmd, cmder, IDA, Tableau. I can't think of a single application (except for Eclipse) which I noticed not scaling well.


And probably none of these is Qt-based. (I'm not blaming Qt specifically, it's a great framework, simply showing why you didn't have the problems). And of course there are solutions, but I wish things worked a bit more smoothly:

RStudio on Windows (fixed now, and some workarounds) - https://support.rstudio.com/hc/en-us/community/posts/2065421...


I don't use my Windows install for development so I can't comment on that. However, I'm seeing issues with Photoshop and a lot of other programs that end up very small, blurry, or with their images all stretched out.


So Chrome OS & OS X probably solve this somehow, but all my Chromebooks have crouton, and yes, Windows 10 support for 4K displays is not there yet. Most of the apps work, but there are some that don't (Electronic Arts' Origin client, which is written in Qt, was not scaling properly; same for the Minecraft launcher; there are other examples too). Also, in the "4K" mode I would often see screen tearing in some games (if they can't stick to 30 or 60 fps), but that's more due to the 4x pixels that need to be drawn. Because of that I've switched my son's laptop back to 1920x1080 - it still looks reasonably crisp, and things work well.


I think you're missing parent's point. The benefits of staring at text all day with >30yo eyes also applies to Linux. I use Arch and Fedora on high-dpi displays and I'd sooner give up programming than ever return to ~72dpi.

Parent isn't talking about most apps; they're referring to terminals and IDEs, for which high DPI is absolutely critical and outweighs the occasional inconvenience of poor scaling outside those apps.


But that's where I disagree, especially when it comes to terminals. I'm so used to seeing the pixels that they don't bother me, and even then, hinted anti-aliasing helped quite a lot. What I'm really missing (and hence my heavily opinionated answer) is that there are a lot of people who may have just started coding on higher-DPI monitors/devices, and to them going back would be terrible.

I started on Apple ][ :)


I'd hate to go back to the eye-strain inducing fuzzy low-res CRTs we had to tolerate back in the day. I can look at my rMBP screen all day without getting a headache.


> What I'm really missing (and hence my heavily opinionated answer) is that there are a lot of people who may have just started coding on higher-DPI monitors/devices, and to them going back would be terrible.

It has nothing to do with that. I learned to program on a 72 DPI Sony Trinitron screen in the '90s. UltraHD is strictly a less eye-straining experience, "what I grew up on" be damned.


I haven't seen anything approaching 72 dpi since 1024x768 stopped being common; most modern displays are north of 100 dpi. It's still far too low, though. We've been held back far too long by OS and app makers' inability to adjust to a reasonable resolution.


If you're seeing screen tearing, specifically when you see horizontal movement, then odds are it's the graphics card and drivers... If you're using a beefier graphics chipset, it works better.

I usually set it to 1440p scaling mode, or 2x... tbh, I have terrible vision anyhow, so I find the ultra-high resolution displays on phones don't help me much.


I noticed that the one app that displays terribly on my Surface Pro 4 is the Quassel IRC client. Also written in Qt.


But even then I would get a 4K external monitor and save the GPU/CPU cycles and battery life on the laptop. Being hunched over a laptop all day is probably a lot worse than having a lower-res display in the end, so if we're talking ergonomics, better to get an external 4K screen.


> Being hunched over a laptop all day is probably a lot worse than having a lower-res display in the end, so if we're talking ergonomics, better to get an external 4K screen.

Weird dichotomy. I use both simultaneously. No reason to confine yourself to one screen when 2 will do. And sometimes I code on the go, too.

The hand-wringing over battery life/GPU cycles with a 4K display seems misguided -- I get ~10 hours of battery life on my Retina MacBook Pro. That's no worse than my previous non-Retina version, and the damn thing got thinner, not bulkier.

It doesn't even switch over from the cheapo integrated GPU to the Radeon unless it has to drive more than one 4K display simultaneously. The battery impact of driving the integrated display seems negligible and totally dominated by the backlight.


I think the issue is that Windows laptops with those displays get much worse battery life than ones with standard HD displays. Apple does a great job of optimizing their software for their hardware, which is why you usually don't see those battery life issues on MacBooks compared to Windows laptops. Have you ever run Windows on your Mac? Talk about a battery drain.


I've done the Windows-on-Mac thing, but I assume half the problem there is that Apple's Windows drivers aren't particularly good (do the fans still just run full blast all the time?).

I'd hope that a dedicated Windows machine would have better drivers, but that's probably putting too much faith in the vast majority of manufacturers.


I ran Windows 7 on a 13" MBP, and then the 13" MBA when it originally came out. I never experienced full-speed fans unless I was playing video games on it, doing something computationally expensive for an extended amount of time, or sitting with it in my lap, on top of a blanket, depriving it of ambient airflow.


People probably generally have both. When you are in the office, use the big screen but lots of us do a considerable amount of work from client sites, on airplanes, etc...


The crisp display is truly great.

What sucks is that a dual-monitor setup of a low-res monitor and my hiDPI laptop monitor is not that great. I have to upscale my monitor a bit, which makes the monitor look even worse than it already is (1920x1080). The monitor has a lot of room, but I naturally want to leave most windows on the small hiDPI laptop screen.

HiDPI monitors cost waaaay too much for me to get one. So for now, I just have to wait until the price starts going down.


You can get a QNIX QX2710 IPS 1440p for ~$200 on eBay. Do pay attention to connectors - presumably with a laptop you want one of the versions with HDMI or DisplayPort inputs, not the DVI-only variants. It has a PWM backlight, other versions have DC backlights (Check out the Crossover 2795QHD) but you are stepping up closer to $300 at which point you are competing with options like a refurbished Acer XB270HU or XG270HU 144hz/gsync monitor.

28" TN 4K monitors are regularly going for ~$300 nowadays. Again, Crossover makes a few here, the 288K and 289K. If you want IPS, keep an eye on the Dell Outlet for a P2715Q. They tend to run a lot of 35-40% off coupons for monitors, you can get one for under $350 if the timing works out. You will need DP 1.2 to drive 4K@60hz - HDMI 2.0 doesn't have any significant market penetration yet.

The Koreans basically make all the panels that go into monitors, and they take the ones that failed QC for whatever reason and sell them cheaply through channels like eBay. You may have to put up with a few dead pixels, but you can get a great IPS 27" monitor for the price of an average whatever. Don't bother getting the "pixel perfect" warranty unless you are OK paying to mail it back to Korea for replacement.


I wouldn't really recommend the 1440p versions. I had one briefly, but I didn't really find it comfortable to look at. With scaling disabled the text is too tiny, and scaling (probably 125-150%) doesn't work as well as with higher resolutions. I now have a 28" 4K screen that can take a higher scaling factor, and it looks much better (although still worse than an MBP with a Retina screen). I guess 24" or 27" with 4K would also be good.


Wow, this is great information. Thanks.


I've purchased several of these 1440p Crossover monitors and they're pretty good. It is worth paying the ~$20 more for the variants listed as "pixel perfect" (If you're buying from one of the handful of reputable sellers).

I also have a Yamakasi M280PU which is a 27", 4K panel.


I agree, I can't use 1920x1080 or lower. Plugging my MacBook into a TV is horrible for anything but watching TV from a distance.

I think 2560x1440 is probably the ideal resolution for the time being, looks good enough, no need for a graphics card, cheap monitors, etc.


I wouldn't make a judgment based on a TV, though. I go from a 1080p monitor to the TV and I can't read the text at all on the TV, while I can on the monitor just fine. Even with the TV in 'PC mode' it still does unnecessary post-processing.


I use/recommend an ASUS PB278Q 27" at 2560x1440. It works quite well for coding.


Since I don't have retina experience beyond cellphones, my 22" 1920x1080 is tolerable - on Windows. On Linux, oh boy..


27" 1440p is 108 ppi, which is the current sweet-spot IMO.

24" 1080p is 92 ppi, 22" 1080p is 100 ppi. So you're a bit behind the curve in terms of ppi, but it's probably not that noticeable. The big thing you're missing is just that 22" isn't that big a screen.

For reference 27" 4K is 163 ppi, and many people find that's too much and need to use some scaling.


To piggyback on this comment, I'm sure I'm not the only one who finds high-density displays less fatiguing. That in itself is worth even more than the performance trade-offs.


Well, for one, Eclipse(!!!) doesn't scale and is unusable on 4K displays. Tons of other applications don't either... things get goofy and it makes the whole system seem kind of amateurish... And I don't really see the difference with text... maybe you have very small fonts?


For the longest time I refused to wear glasses and learned to read 13x7 bitmap blurs. Even with glasses, I can't read any other font nearly as well, so I'm stuck with my low res screens.

What I really want is a low res screen as bright as the retina.


I feel the same way - though, it's all about DPI. A 1080p display at 22" is not very good, but a 1080p at 13" is very very nice.

I have a 13" retina Macbook Pro with god knows how many pixels, and I'm left with a choice between full HiDPI where everything is giant, or "scaled" mode where the GPU renders even more pixels and then scales it down. This is because the UI components in OS X are bitmaps, and there is a set of 2x HiDPI bitmaps, and a set of 1x regular ones that are too small to read on a retina display. I use the scaled mode where it renders everything as if I had a giant monitor and scales down the whole screen to a normal size. Since I have the 1st generation 13" retina, it understandably struggles at times with its Intel GPU. It does look great, though.

I recently got a Dell Chromebook 13" with a 1080p IPS display, put Debian on it over Chrome OS, and now it's all but replaced my Macbook for everyday use. Everything is a decent size, and there is no need for the OS to do any scaling or anything weird, so everything works nicely. I mainly got it so that I wouldn't have to lug such an expensive laptop on the subway, but I've really grown attached to it. The DPI is high enough that I can't really discern the pixels, IPS displays tend to look great, and I forgot how much I missed using Linux while I was on OS X. :)


How do you manage with 16GB of storage?


I upgraded it to 256GB - it just has an M.2 slot inside it.


Linux Mint with HiDPI on looks awesome at 3200x1800, IMO way better than at 1920x1080. YMMV.

Also, even Windows 7 looks way better than Windows 10 when you enable 200% scaling in Intel drivers.


I recently switched to the 3000x2000 Surface Book. Worrying about capacity being "wasted" is nonsense - how would it be better to not have that capacity? Some software looks bad, but I'm sure that'll be fixed as time goes by. Games look great, text looks even better, particularly if you read a lot.


I also have a Surface Book. The display is stunning when viewing photos and I have no issues browsing the web but I find using IntelliJ a bit difficult at the default 200% scaling. It's all a bit small and adjusting the font sizes is not as easy as it sounds (you get big fonts but everything else remains small). I also tried increasing the OS scaling to 225% or 250% but that introduced other issues (like when watching fullscreen videos you get a border).


I found I adjusted quickly to how Eclipse looks on it FWIW. The icons come out undersized but everything else works correctly, and I've grown to like the result - it feels like more space for the actual code.


I'll take back my previous comment. I just now attempted to customize the two font settings in IntelliJ and wow... it looks absolutely amazing. I changed the editor font to Consolas and increased the size a tad.

When I first got my SB I remember attempting this but I encountered odd glitches like fonts overlapping in the Project view.


Cost of display scanout in terms of power? When nothing's moving, more pixels = higher bandwidth requirements, and the thing has to be clocked higher. (I guess it runs constantly. The new non-scanned LCDs will help I suppose.) You also need more RAM, and that needs refreshing. And any time anything is moving, there's 4x as many bytes being shuffled about, and they have to be shuffled about 4x as quickly, because the display area is the same.
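
Rough numbers, assuming 32-bit pixels and a 60 Hz refresh (a sketch that ignores tricks like panel self-refresh):

    def scanout_gbps(width, height, hz, bytes_per_px=4):
        # Bits per second the display controller reads just to keep the panel refreshed.
        return width * height * bytes_per_px * 8 * hz / 1e9

    print(scanout_gbps(1920, 1080, 60))  # ~4.0 Gbps
    print(scanout_gbps(3200, 1800, 60))  # ~11.1 Gbps, ~2.8x the memory traffic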


I think at least part of the implied tradeoff is resolution vs. battery life. More pixels == more GPU computation == more power required.


AIUI the backlight dominates the power consumption, so more pixels at the same physical size makes very little difference.

But sure, that's a lot more concrete. So are you unhappy with the battery life on this device? Does the laptop the grandparent was talking about have worse battery life than comparable models? Certainly I'm very happy with the battery life of my Surface Book (which is a lot better than that of the laptop it replaced), so to my mind it's not a problem in practice.


more pixels requires more light, even on the same size display... So a 4K display the same size as a 1080p display will require 2-3x the light behind the display... I don't recall the exact amount extra, but it's not insignificant.. let alone the processing power to push those extra pixels.


>more pixels requires more light

[citation needed]?


More pixels means more circuitry in front of the backlight blocking light.


But to what degree? Blocking 2% vs. 1% of the light probably isn't noticeable; blocking 50% vs. 25% probably is. This might also be something that improves over time as process improvements reduce the overhead.


Potentially a lot. The iPad 3 requires 2.5x more backlight power (7 watts) than the iPad 2 for the same brightness.


Thanks for quantifying that - the difference is 4.2 watts, which is huge. I wonder if the LED backlighting is getting any more efficient?


3200x1800 is 282 dpi. That's about on par with high-end tablets, worse than good phones (440 dpi or so), and worse than jaggie-free laser-printing (600 dpi, although that's not strictly comparable).

Text on high-DPI laptop screens looks enormously better than on low-DPI ones. And whatever issues Windows has with it, ChromeOS scales everything great.


Well, if you believe Apple (and in this case I do), then it's less about just DPI and more about the combination of DPI and typical viewing distance [1]. Phones are generally viewed at closer range than tablets, which are themselves viewed at closer range than laptops, which are in turn probably viewed at closer range than workstation monitors.

1: https://en.wikipedia.org/wiki/Retina_Display#Technical_defin...


Sure, at work I have two 1920x1080 24" monitors, and I've used that configuration for several years (had the exact same setup at my previous job). I like it better than the other option of one huge 4K display (30", or was it 36"? - I don't remember).


24" 4K is a really good option though


I agree, but on OS X that is the same screen real estate as 1080p. I personally use 5K monitors so I can get 2560x1440.


Yes, but with much sharper text :) 5K 27" is obviously better, but 3x the price.


You can change the scaling in the control panel though.


Comparing laptops to tablets/phones is a bit silly though.

You hold your phone/tablet much closer to your eyes than your laptop.

A phone needs to be pretty high dpi. A laptop doesn't.


> Maybe it'll be better on linux

Nope. Scaling is supported by Gtk3 (but you can only scale by integer - 2x, 3x) and Qt5. A lot of apps are still using Gtk2 and Qt4. It is a mess. I wish I had an FHD display instead of WQHD.
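
There are at least per-app workarounds via environment variables for Gtk3 and Qt5 (a minimal sketch; the app names are placeholders, GDK_SCALE only takes integers - which is exactly the limitation above - and QT_SCALE_FACTOR needs Qt 5.6+):

    import os
    import subprocess

    # Launch a Qt5 app with 2x scaling (QT_SCALE_FACTOR is honored from Qt 5.6 on).
    subprocess.run(["some-qt5-app"], env=dict(os.environ, QT_SCALE_FACTOR="2"))

    # Gtk3: GDK_SCALE=2 doubles widget sizes (integer-only). If fonts end up scaled
    # twice because Xft DPI is already raised, GDK_DPI_SCALE=0.5 compensates.
    subprocess.run(["some-gtk3-app"],
                   env=dict(os.environ, GDK_SCALE="2", GDK_DPI_SCALE="0.5"))

Gtk2 and Qt4 apps ignore all of this, which is the mess being described.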


Well, if you use the gtk3 HiDPI settings, yes. However, if you disable HiDPI, scale it to 1x, and then set the font "Scaling Factor" in gnome-tweak-tool to something else, say 1.25 (a non-integer!), it works great!


For Gtk3 apps it works reasonably well (it scales text, but not buttons, widgets, etc). There's nothing you can do with Gtk2, Qt4 apps though (other than changing font size).


Can't you just set it to FHD?


Not without having fuzzy text. I explained it once on reddit:

> Is there any difference between FHD monitor and WQHD in FullHD mode?

Yes, there is. WQHD is 2560x1440 pixels. If you set your resolution to half that (in width and height) it is 1280x720 (2560/1280=2, 1440/720=2). In that case every software pixel is represented by 4 physical pixels (2x2) and we are fine. If you set resolution to 1920x1080 (FHD): 2560/1920 = 1.33, 1440/1080=1.33 . And we know there is no 1.33 physical pixel! So the image becomes distorted and looks bad.
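
Put differently, a logical resolution only maps cleanly when both dimensions divide evenly (a tiny Python sketch):

    def scale_factor(native, target):
        # Physical pixels per logical pixel along one dimension.
        factor = native / target
        return factor, factor.is_integer()

    print(scale_factor(2560, 1280))  # (2.0, True)  -> each logical pixel is a clean 2x2 block
    print(scale_factor(2560, 1920))  # ~1.33, not an integer -> must interpolate, text goes fuzzy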


It doesn't have to look bad.

The 13" rMBP has a physical resolution of 2560x1600 (i.e. 1280x800 x 2). The control panel offers resolutions of 2880x1800 (1440x900 x 2) and 3360x2100 (1680x1050 x 2), which the GPU scales down to 2560x1600. And these resolutions look good and sharp.

In fact so good, that the final resolution depends only on what physical size / working space you prefer, not which one looks good and which one doesn't.


Keep in mind that Apple claims they developed custom scaling algorithms to handle these downscale modes, so a PC user doing the same thing using GPU drivers might not get as good results.


I tried 1920x1080 on my 2560x1440 screen. I don't think it looks very good.


That MBP has physical 227 DPI. What's yours? (If it is 2560x1440 at 24", it has only 122 DPI).


> That MBP has physical 227 DPI. What's yours? (If it is 2560x1440 at 24", it has only 122 DPI).

It is a 14 inch WQHD (2560 x 1440) display (Lenovo Thinkpad X1 Yoga)


Good point. What about GPU scaling, though? Don't Nvidia and AMD drivers support that for lower-than-physical resolutions?


Not that I know of. Screen can be scaled with xrandr (on X11), but that significantly decreases performance. There's a chance of making it work on Wayland (modern replacement for X11) in the future.
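
A rough sketch of the xrandr route (it assumes X11 and that the internal panel is named eDP-1 - check the output of xrandr first; scale factors below 1 render fewer logical pixels and stretch them back to the panel, which is where the blur and the performance cost come from):

    import subprocess

    # Render a 1920x1080-sized logical desktop on a 2560x1440 panel (0.75 = 1920/2560)
    # and let X upscale it to native. The output name is machine-specific.
    subprocess.run(["xrandr", "--output", "eDP-1", "--scale", "0.75x0.75"], check=True)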


Can’t you switch to some form of “Retina scaling”, e.g. 2x which result in an effective Retina resolution of 1600x900?


I got confused trying to find that option. It's a laptop with Intel/Nvidia graphics. I saw something a while back where the graphics card vendor provides such scaling (i.e. done a bit deeper in the stack, hence probably working more as intended than not).

Yes, my concerns are now primarily about Windows 10 (Home Edition, if that matters), since I'm not really bothered by (though I can see) such cases with my Linux installs (which is where I work).

It bothers my son though, as some of the windows are very small (like, as I said before, EA's Origin client) :)


I use Windows 10; I’ve found something under Settings > System > Display > Advanced display settings > Advanced sizing of text and other items > “set a custom scaling level”. It lets you do 200%, but I haven’t tried it out, sorry.


I'll try this sometime this week. Thanks!


Retina screens on MacBooks look amazing. As for VRAM being wasted, that's utterly irrelevant; there's memory to spare on any machine capable of 3D.

Windows 10 has better High-DPI support, but it's far from flawless. The real problem is the Windows software ecosystem is full of junk that hasn't been updated properly or was never programmed correctly in the first place.


I'm a "moar pixels!" kind of guy. I'm very happy with the latest generation of the Dell XPS 13, which has... lots of pixels and works great with Ubuntu and xfce. I was worried that there might be some issues, but I'm very happy with it.

I'm glad to see we finally broke through that wall of no resolution improvements that lasted several years.


A) Windows doesn't handle modern displays properly and has a huge library of problems relating to desktop scaling etc...

B) What was the context of your use of the computer? I.e. Are you gaming or logo branding etc...


Windows 10 - gaming - and some apps/websites. Another example was the mouse driver (after three deep dives into the settings menu) - now its window looks ridiculous. Out of place :) - first it looked a bit like Win95, or maybe Win2000/XP, then it was smoothed (blurred for some reason), and the size was completely wrong.

That was not a big deal, though. The real big deals are the windows that come up very small; I've tried backward compatibility settings for them (individually) and it didn't help.

The only thing that helped, as I've said, was to set it to exactly a 2x lower resolution (1920x1200 or was it 1920x1080 - I'm not in front of that computer to say exactly).

Funnily, it now looks a bit more blurry (not sure why; I was expecting integer scaling), but after some adjustments it looks almost good (not as crisp as before, but good enough).

Also, for some games, no more tearing. Mainly games that don't do real fullscreen (possibly with a custom resolution), but somehow expand the window and remove toolbars, etc.


2560x1440 is the sweet spot for me currently. It's a notch crisper than my old (now secondary) display (27" 2560x1440 vs 24" 1920x1080), but is perfectly usable without any scaling and is relatively inexpensive.

My MacBook Pro (13" 2560x1600) is noticeably nicer looking, but not really worth the additional hardware costs and scaling hassles (Windows 10 is better than previous versions [my Surface Pro 3 is set 150% scaling IIRC], but its far from flawless) IMO.


I was looking for a specific laptop for my daughter that came in both 1920x1080 and 3200x1800 and the former was almost completely sold out everywhere. So I guess a lot of people prefer the lower-res. I felt the same way because if you have two laptops with identical specs but one has 4x as many pixels to push -- that's quite a trade off.


It's also the price difference - 4K displays have their uses, and people appreciate them for different reasons, but yes, I was in the same situation, and I felt ripped off for not buying the cheaper 1920x1200 (or was it 1920x1080?) option. That money could've gone toward a slightly bigger SSD, or maybe more RAM. He really doesn't need more than that.


In my case, the price difference was actually negligible. I just felt the lower-res display was the better choice given all the variables: weight, battery life, performance, RAM, thickness, etc.


Except for the occasional webpage/manual/LaTeX document, I'd be happy with an orange-on-black text mode. Yay for symbols.


It's all about DPI.

Eg. 12" notebook with 1080p is great.


You are buying wrong laptops for the job and then being upset about them instead of blaming yourself for the poor choices.


The job was for my father to use some CAD tools, and he switched to 1920x1080 (for other reasons) - and kept switching between Windows 7, 8, and 10. I thought the higher resolution would please him, but he just complained :). And then for my son - I just couldn't find the same laptop with a lower resolution (after the experience with my father).

The first was from Fry's, and the deal there (a few years ago) was a laptop with a dedicated GPU (8GB) + 12GB of RAM; the current one has only a 4GB GPU, but that's enough for gaming, and it comes with 16GB of RAM.

I myself am using an old Acer Chromebook (the one with PageUp/PageDown keys), with crouton on it. It was the reason I started this conversation. I love Chromebooks, croutonized or not, but I fail to see the reason for a higher resolution. I might come to my senses :)

The only machine that handles resolutions correctly is the OS X laptop my company gave me, picked out of a choice of a Chromebook, Linux, or maybe even Windows (rare).


I wanted to make the cheapest possible development machine I could, assuming I did all my work on a remote server. So I bought a Lenovo 100s for $80 from Amazon (ok, it was $160, but I signed up for the Amazon credit card for the discount). It's one of the lowest end chromebooks you can get.

But it's awesome. The google docs/sheets/etc features are nice, and it's perfectly capable of web browsing. For development, if you work in the cloud, using Vim on a remote server, all you need is an SSH client.

For fun, I put Kali linux onto an SD card and I can use crouton to switch to the Kali chroot. So I can literally plug in an SD card and have a totally fresh "development machine" for $80.

It's great for traveling in developing countries because if it gets stolen, I'm out $80, not $1500.


Is there really any other use case for a Chromebook given how cheap a decently specced laptop is these days? I'm confused as to HP's strategy of making an upscale Chromebook - who is the target market?


Home users aren't the target audience, which is the source of some of the confusion. And probably not developers either, even if the capabilities suit many web/Linux types. There likely aren't enough to make serious money.

People mainly see Chromebooks in retail and think of them as cheap gifts for the grandparents so they don't have to do computer support, and that misses the bigger picture. What they may not have seen is the truckloads of Chromebooks being sold to schools, where low maintenance and security are as valuable as the price.

The high end Chromebooks for Work are part of a push into the enterprise market.

Chromebooks have a decent security model. The storage is encrypted. The os is signed. The system processes run in jails where possible and the browser processes are sandboxed. You don't have to worry about someone picking up some media on the street and plugging it in and installing a keylogger.

Combine that with extreme portability and low maintenance costs and if you are the sort of org that can move a lot of things to the cloud or remote sessions then there is likely a business case for them. I guess we will have to wait and see.

Back in 2014 there were stories that Woolworths, the biggest Australian retailer, was rolling out 8000 Chrome OS devices. That is the sort of market these higher spec devices will be targeting.


Maintenance. Why maintain a copy of Windows when all you use is the web? Antivirus, security, app and driver updates, bloatware shipped preinstalled, etc. Here you just pull it out of the box, log in, and it's good to go until the battery or other hardware wear out.


Well, that's fine for nontechnical users, but nontechnical users aren't buying high-end laptops anyway. High-end ($800+) laptops are bought by developers, sysadmins, or other people who need real native applications, which precludes Chromebooks. So these machines are in a weird market segment where they're too expensive for the vast majority of Chromebook buyers, but not capable enough to steal marketshare from, e.g. Thinkpads and Macbooks.


For what anecdata's worth, I see a huge number of non-technical people who buy high-end laptops (specifically Macs), and as far as I can tell, it maps fairly well to the individual's socioeconomic status. Everyone from baby-boomer parents to social workers. It's why you see so many Macs and iPhones as product placement ads within shows whose target markets dwarf the techie population:

http://www.businessinsider.com/apple-product-placements-in-t...


Yes, but I think a non-technical person who buys a $1000 MacBook is extremely unlikely to buy a $1000 Chromebook. The main appeal is obviously the brand and colours (my sister wants a new MacBook purely because there is a rose gold one; she doesn't care about specs). A Chromebook is no different (in their eyes) from a $200 Acer - it's a "non-Apple" laptop, so it doesn't matter how well made it is or how well it performs.


My argument is that developers do not need high end notebooks. Unless your development work requires specialized hardware (e.g. iOS development requires a Mac), there is little need to develop locally. Most of the time, you're targeting servers anyway. Why not develop on a server too? Also, when you start a new project, you can pick a cloud development box that matches exactly the specs you want, whereas you're still stuck with the same laptop from project to project. So in that sense, cloud development offers more flexibility than local development. And it saves you money on expensive laptops!

There's also the self-satisfaction of knowing you can code anything with any $100 laptop and an Internet connection.


I use a Chromebook as my main dev machine these days. It is wonderful to finally have a fanless laptop, a 15-hour battery life, and a mass under 1kg. Except I nuked Chrome OS for real Linux and carry a few gigs of documentation to make internet optional.

Some devs get it and see how it could work. They mostly get hung up on the keyboard - the lack of home/end/pgup/pgdn. (Readline keybinds are my best friend now.) Other devs are flabbergasted that you can do anything of value on a machine without at least 16GB of RAM and multiple monitors.


That's fine if you're coding without an IDE and if you can tolerate a low-resolution screen. I used to be extremely tolerant of such things, but now I'm 30 and thus a senior citizen in the tech world. I need my creature comforts.


How much are you spending per month on that cloud dev box? What do you do for Android development?


> And it saves you money on expensive laptops!

Are you accounting for the cost of the cloud dev box?


One can turn off e.g. EC2 instances when one isn't using them, and even the nano/micro instances are perfectly capable for most dev tasks, so "accounting" for this cost might be a waste of time.


If I'm using a nano or micro, I might as well code directly on the Chromebook, which at least has 2GB of RAM. The point of using a cloud box, as I understood it, was avoiding the need for an expensive/powerful laptop, and for that you need at least a t2.medium, in my opinion.


I'm somewhat incredulous that anyone needs that, but in that case you need to budget 40¢/day.


You're incredulous that some software stacks might need more than 2GB of RAM to run?

In any case, those 40¢/day come to ~$130 extra per year, with which I instead bought a used Thinkpad X220, which comes with a decent CPU, supports 8GB of RAM, and weighs about the same anyway. I simply don't get Chromebooks.


Well, sure, there are systems requiring any particular resource level. Long before the requirement hits 2GB of RAM, one might wish to move to a server, if for no other reason than to minimize installation hassles. Clearly, preferences vary.

Do you develop on any particular project 325 days a year? That's hardcore.


Why wouldn't non-technical users buy high end laptops? Even just for a metal casing, better appearance, or performance in day to day use (e.g. gmail), battery life, screen, etc., higher end laptops have plenty of ways to appeal to non-technical users.

A lot of it likely comes down to price sensitivity of the purchaser, which isn't linked to their level of technical skill.


Why do sysadmins need high-end laptops? All they do is SSH.


If opening a new credit line and getting a hard hit on your credit report is worth $80, fine. But in absolutely no way was that a product discount. Amazon credited you $80 for opening a credit card with them. All credit cards have rewards. But it had nothing whatsoever to do with the cost of that product.


I agree the 80 dollar discount is irrelevant to the Chromebook but...

Is signing up for a credit card bad for your credit report? I would have thought that as long as you pay it off promptly each month, it's an overall benefit to your credit score?


Every time you apply for a new card, you get a hard pull. Credit agencies don't like too many hard pulls in a short period of time because it looks reckless to open so many new lines of credit at once.

It's unlikely that opening a new credit card from time to time will do serious damage to your credit. And over the long term, having a lot of credit available can be beneficial to your credit score.


Why do people in US care about your credit score? What difference does it really make? Genuine question.

Here in Australia it seems the banks want to shovel credit cards down our throats no matter how much debt we are in or how bad we are at paying.


Some employers will even decide who to hire based on their credit score. Bad credit = can't get a job. John Oliver did a bit on the whole credit scoring system scam https://www.youtube.com/watch?v=aRrDsbUdY_k


The credit score is used for all sorts of things in the US, from the interest rate on your mortgage to qualifying for renting to even some employment checks. It's fucked up, especially since the formula for calculating the 'score' differs from one agency to another and is deliberately kept obscured.


It also seems kind of like voodoo; scores change up and down over time with zero change in behavior.


Yeah that's why I really don't pay attention to it. Plus the fact that it is so easily gamed: if a party wanted to (such as a realtor or someone else with access), they could just run hard credit checks on you all day and tank your score.

To illustrate how fucked it is, I use a credit tracking app from my bank and it consistently reports scores of 700 and above yet I've had realtors tell me they weren't sure if I could get an apartment because of my credit score - showing up with enough cash for the first six months of payments quickly resolved that issue.


It also lowers your average length of account. If you have two cards for 10 years, your average account age is 10 years. If you open another account, that instantly drops down to 7 years.


That is definitely not true all the time. Chase didn't do a hard credit pull to give me my MileagePlus credit card, which was definitely worth the effort. Besides, the hit from a credit pull is only temporary, so unless you are planning on buying property or changing rentals in the short term, it's basically negligible.

People are a bit too worried about their credit scores. Just make sure to not screw up too badly and you should be mostly OK.


Yeah I understand. I monitor my credit obsessively and have a good idea of what happens to my score in any given scenario. In this case I wanted the additional credit line anyway so it was worth it. I'm still young and at the point where any additional line of credit is a long term benefit for me (decreases utilization percentage, increases number of accounts, increases total number of on-time payments, and long term will increase average age of accounts). Also I like the rewards.


You don't even need the SD card. I got a $300 Chromebook that I run Ubuntu on; I even got High Level Assembly to run.


I thought "High Level Assembly" was a joke about getting the GDC or LLVM to run but, found out it's actually a language.


The SD card is 3x the size of the chromebook's hard drive! ;-)


Have you found a good SSH client that runs on a chromebook? Every one that I tried has atrocious draw speeds, meaning that if you ran a program that generated pages full of output, the terminal would effectively lock up and become unresponsive for long periods of time.


Mosh should solve that problem.

I'm running in a chroot so installation is just a matter of using the package manager of the "guest" OS (scare quotes because chroot is technically same OS).


Mosh won't do anything to fix rendering lag.


I never understood Chromebooks. My used Thinkpad cost about the same (without the discount), it's about the same size and has much better specs, particularly RAM, so I can actually work fine offline.


I used a Chromebook with Crouton before the X220 dropped massively in price, to the same $80-100 OP is talking about. When that happened I bought 3 of them in one deal and some 9-cell batteries; I'm happy as a pig in shit. Using Ubuntu/i3wm I get 16+ hours from it consistently, and when it's empty I just pop in another battery.


Developing what though? Web stuff? What editor do you use?


Mostly systems level stuff, but it's not my main dev machine. I used it when I was on vacation and knew I might need to put out fires on remote servers. i.e., I only needed command line access.

Theoretically I can do all my development in vim on a remote server... I think there is zero reason that any development needs to be done locally, unless you're working on an iPhone/android app (in which case a chromebook is useless anyway).

In my ideal world I liberate myself from local development completely, and my computer is just a dumb terminal to my cloud development machine.


The Windows 10 version seems to have a better keyboard (page up/down next to the arrow keys, something I absolutely love) for a similar price.

http://www.bestbuy.com/site/lenovo-ideapad-100s-11-6-laptop-...


Does anyone know what the possibility/progress is of having full-fledged compilers and interpreters running on Chrome OS in the future? Or is that never going to be a reality, because of the goals of Chrome OS?

It is a great OS, but very light on software development options.

I know there is always Crouton or a native linux install, but out of the box support would be awesome.


There is the NaCl Development Environment (1), with bash, make, git, gcc, python, ruby and lua working. It is still very much a WIP and does not work on ARM Chromebooks. But if Android Play Store apps become available, there will probably be much more choice.

(1) https://chrome.google.com/webstore/detail/nacl-development-e...

Edit: it seems that ARM Chromebooks are now partially supported. I will have to check on mine.


Developer mode. Chrome OS is Gentoo underneath. Most people just use Crouton because Ubuntu/Fedora/whatever are the go-to Linux distros.


Well, isn't Crouton also a more stable way to play with a Chrome OS system? If things go wrong, can't you just blow away the Crouton chroot and keep the stock Chrome OS install (with updates) intact?


That is the case. I personally love Crouton for development on a pretty old chromebook.


There are many browser-based IDE's where you edit code on a remote machine, but I don't think any are mainstream yet.

Personally I'd be fine with a nice editor and a terminal window. Suppose there were a way to get Atom to run in this mode?


Well, seeing as how these are designed for always on internet connections, I'm guessing people will simply remote into a box at home or work to do the compiling.


Wouldn't you just install Linux on it instead?


This is what I do on my chromebook pixel (2), but sadly the audio driver (really just the changes needed in the rt5677 codec) never got mainlined:

https://bugs.chromium.org/p/chromium/issues/detail?id=573776

You would expect Linux to just run out of the box on these machines, since they already run some variant of a Linux kernel from the OEM, but things aren't always so nice.


Me personally? No, I don't mind remoting into my server.


I'd love to see this, and my daily laptop is an Asus C300 Chromebook, but I think it's more likely we'll see dependence on cloud IDEs/environments rather than much native work. If they could get me a Ruby install and Android Studio on there I'd be a happy bunny.

This HP one looks really nice, and might solve a couple of gripes (last couple of ChromeOS updates have slowed it down a bit).


There are decent Android IDEs available if ChromeOS does get the Google Play store as recently rumored.


I'm really happy using Crouton on a Pixel 2 as my main dev machine.


More likely you'll SSH or RDP to a desktop/server environment somewhere, but there are developer options (crouton) if you need more in the box.


Check out chromebrew: https://skycocker.github.io/chromebrew/ It's a package manager for Chrome OS (requires dev mode). It's Intel-CPU-only, but it has a lot of the dev packages you need.


I run crouton on my Chromebook, but really, if you intend to run desktop Linux, don't get a Chromebook. Even waiting for the "developer mode" beeps at boot is annoying, on top of the other compromises (small native storage, etc.).


I tried crouton for a day or so and didn't like it. Why run on top of Chrome OS if you don't have to?

I flashed Seabios on the C720 and just installed native Xubuntu, with a little configuration and driver installation everything works pretty perfectly. There is no initial beep or scary screen and no way to wipe the laptop with one button on boot. Johnlewis does some great work on these builds[0]. Give him some money.

I learned about this later, but could have maybe saved myself some configuration time[1].

Of course it would be much nicer for Acer, Google, HP et al. to provide a Linux build, but I don't really see that happening.

[0] https://johnlewis.ie/custom-chromebook-firmware/rom-download... [1] https://galliumos.org/


I bought a 2013 Chromebook Pixel from somebody. One of his reasons for selling it was that he had enabled developer mode and his child hit the space bar on boot up. That wiped all of his work. He was going back to a laptop.


Did you tell him about DVCS?


Try pressing "Ctrl+D" at boot to avoid the beep and wait.

In any case, I hardly ever turn on/reboot my chromebook, mostly just resume from sleep.


I got myself an Acer ES1-111, swapped out the 2GB DDR3L module and put in a 4GB one. It's priced like a Chromebook but has a full keyboard (a lot of my Emacs stuff uses the "super" key). Sure, the eMMC space is not large and you won't see me lugging my music collection around on it (or crunching large datasets), but for light development (I do mostly Python/Django) it's just fine. And for USD 200, I won't be devastated if it's damaged.

Not really comparable to a high-spec Chromebook, but does the job.


An undocumented (as far as I can tell) way to avoid the annoying beeps is to press ctrl-d on boot. This avoids the developer mode beeps. [source: I'm also a crouton+chromebook user].


With Chromebooks I typically do not need to worry about hardware compatibility. Besides, with Crouton I can have a full-fledged development environment under Chrome OS while still being able to hand the notebook to guests or children for Internet access.


My biggest problem with these Chromebooks is actually the limited local storage - this hardware with more local storage would be a killer linux machine.


In a lot of chromebooks, the storage is replaceable. It's just a normal-sized SSD that happens to have a very low capacity.


Most Chromebooks don't have replaceable storage. Acer seems to be the only one making new Chromebooks with replaceable storage. Also, eMMC isn't an SSD.

https://wiki.archlinux.org/index.php/Chrome_OS_devices/Chrom...


Sure, but the Pixel last year did not have a replaceable SSD, only some of the cheaper/slower models did. I don't know if this one does.


Not anymore, almost all of them today are based on Intel's SoC architecture that can use embedded eMMC memory (basically a SD card on a chip) instead of a SATA SSD that you can replace. It's a cost cutting thing.


Yes, they are just trying to squeeze as much money out of you as possible by forcing you to use the cloud... the storage specs on these chromebooks feel like the 90s to me.


Some of them have SD card slots too.


I'm looking at getting a new laptop this summer, and I've been leaning towards a Dell XPS 13, but this Chromebook is intriguing to me. I had an Acer C720 Chromebook for a while and had a really good experience running Chrome OS and Ubuntu through Crouton. Having 16GB of RAM is an important feature to me, and I'm disappointed the non-touch XPS 13 doesn't offer it. I'm probably also all right with having limited on-device storage, because USB 3 is fast and external storage is cheap.

How capable is the Core m7? Is it hefty enough to drive a KDE desktop smoothly? What about playing light linux games like Civ 5 or Crusader Kings?


The thing that kind of annoys me about the XPS is the location of the camera. It's in the bottom bezel of the monitor. They should have simply not put a camera in the machine because all you will ever get are upshots of your nostrils, closeups of your hands while typing, or the bottom half of the image being a close up panoramic shot of your keyboard if you try to bring the camera up to eye level.

I don't know what Dell was thinking, but clearly they weren't thinking much. It's horrible.


> What about playing light linux games like Civ 5 or Crusader Kings?

It appears to use the HD Graphics 515 processor, which seems to have performance in the same ballpark as Intel's HD4000/HD4400 graphics processors. I've played Civilization V on both those (HD4000 OSX & HD4400 Windows), and it runs fine on the lower end settings. I don't see why the HD515 would be significantly different.


The new ThinkPad 13 might be worth considering if you're not set on a top-notch screen or an aluminum case.


Thanks, I was definitely looking at Thinkpads also, but I hadn't taken a good look at the thinkpad 13 yet.


> What about playing light linux games like Civ 5 or Crusader Kings?

It uses an integrated Intel graphics chip -- it won't be able to do any decent gaming.

At the 16GB price point of this thing ($1k), you could get a more conventional laptop instead of Google Chrome/Pixel OS systems and get a lot more power and flexibility out of it.


> > What about playing light linux games like Civ 5 or Crusader Kings?

> It uses an integrated Intel graphics chip -- it won't be able to do any decent gaming.

Depending what you mean by decent gaming. I've played Civilization V on Intel Graphics (HD4000 OSX & HD4400 Windows), and it runs fine on the lower end settings. Not ideal by any means, but the newer Intel Graphics chips are OK for some games.


I think you missed that they used Ubuntu on their previous Chromebook (I do too) and so would probably do the same with this one


Still no more info on storage? I'd buy the $499 model in a heartbeat if it had a 32GB SSD (or if it was upgradeable). The fact that they haven't released those details yet makes me think the number will be rather unimpressive.


According to the Ars article[1] I read, all models have a 32gb eMMC storage drive.

[1]: http://arstechnica.com/gadgets/2016/04/hp-and-google-are-mak...


Likely eMMC :-( Otherwise they would have stated SSD/M.2


Now you can run 16 Chrome tabs at once!

Jokes aside, if Google gets the Chrome OS/Android merge or more desktopification of Chrome OS done soon, this will be a good Chromebook to have.


Run lots of tabs with https://chrome.google.com/webstore/detail/the-great-suspende...

I saw this extension recommended on another thread and it's worked out very well for me. It suspends tabs that aren't being used after X amount of time, normally 1 hour. Activity Monitor on my Mac showed a huge difference in CPU activity.


They're working on enabling tab discarding (from mobile Chrome) on the desktop. Check out the tab discard experiment, but it doesn't have automatic discarding yet, just manual through chrome://discard.

They want to add dom state serialization before they enable automatic discarding.


Eh, I wouldn't go crazy. It's only a dual-core Core M processor after all.


What I would love is an 11-12" Chromebook that has ~8GB of RAM and an expandable SSD option, has a good screen, and is military-spec so it can take a beating when I go travelling.

My current MacBook has a very narrow operating environment (10-35 degrees centigrade) and can't withstand being taken camping on the beach.


I want a Chromebook, but all the ones I've seen are either the price of a good laptop or crap. Why can't I pay £200 for a good one? What about £300 then? Nice screen, 6 or 8 gigs of RAM, a CPU more powerful than the one in the average phone? They seem very expensive for what they are (a midrange phone in a laptop body, as far as I can tell).


16GB of RAM for running Chrome. Sounds about right!


This hardware looks great and the specs are bang on. How well would this run Kubuntu?


I would guess it would run it fine. My C720 has significantly reduced specs from this and I don't notice any problems with ubuntu mate


How is a Core M considered premium?


Best CPU that doesn't require fans? I love my Core M-based ultrabook with 3k screen and zero noise while compiling (and performance is on par with Ivy Bridge 3517U).


It's faster than you'd expect, despite the very low TDP. I have the base model 2nd gen 12" MacBook, and I haven't had any issues with anything feeling slow on it.


I've been looking at the 12" Macbook line since it released and I'm curious how it performs dev work - do you run XCode on it? How's the form factor working for you overall?

Sorry to derail the topic - just genuinely interested in the thoughts on that hardware!


I primarily use Atom and do JavaScript + Laravel development (with a Homestead VM always running!) and for that it performs quite well. I haven't tried running XCode yet, but from what I read before I bought it, I doubt it will be much of an issue.


I'd honestly rather have slower compilation time than hearing the sound of a jet engine.


I'm writing this on a Core M. It really doesn't feel like 800MHz. Compiling stuff might be a bit slower, but the main performance bottleneck in most modern PCs seems to be disk IO anyway, rather than raw processing speed.

If you're doing video rendering, or using a bloaty IDE etc. then you might need something more powerful. For general use though, a Core M is completely fine.


It is premium in terms of power efficiency in that performance range. Nothing else is close.


Can you completely strip Chrome OS from these Chromebooks? Including BIOS-specific stuff like the automatic rewriting of a modified base system? So that you could boot straight into Linux without worrying your OS will be gone if you forget to press some key upon boot?

BTW, flash-based storage (likely eMMC) for >$1k is just eek.


The last premium Chromebook, the Pixel, yes you could install Linux and just boot into it.

Problem was, Linux... Debian was not ready for this device, especially the high res screen, which made the UI in Linux super duper tiny. Debian saw the screen as a giant regular resolution screen...

Lots of edge cases like this, especially with touch screens.


Thanks! I was asking because I would like to have a dedicated "retro" machine for running all 8-bit/16-bit and console emulators and a Pentium 4405Y would be capable of running all of them smoothly (Baytrail/Braswell are too slow); 32GB eMMC would be sufficient as well.


Frankly, if yer looking for a portable way to play emulated games and old computer stuff, the nVidia Shield Portable is amazing. It's like someone at nVidia saw all these GP32s and weird handheld emulation devices and said, "geez, we could make that actually work."

Alternately, an Intel NUC or a compute stick works, too. As do Raspberry Pi. I would caution against making this kinda device into an emulator box. You want generic hardware.


For Dolphin you really need something based on Core M or better to run smoothly (CPU intense). I have a Shield TV and never considered it for it; 8-bit/16-bit would run there smoothly I am sure though. If Pipo KB1/KB2 had Core M, they would have been ideal both performance and retro form factor-wise ;-)


This won't be suitable for bitL if they want Core M, but a super-interesting device in the same performance and size bracket as the Shield (but a much stronger focus on general purpose mobile computing) is the Pyra, which just started taking preorders a few days ago and will likely ship in a few months.


Fedora and Ubuntu have worked really well for me on high-DPI screens. Unity and GNOME didn't require any tweaking.


Edit: reread your question - probably not. Most Chromebooks can't have the ROM totally overwritten.

Most Chromebooks can run Linux by writing over a sector of the ROM. On most (and probably this laptop) you unfortunately can only write to RW_LEGACY or BOOT_STUB, which introduces significant problems. I'd definitely just run crouton on these.

https://johnlewis.ie/custom-chromebook-firmware/rom-download...


All Chromebooks allow you to completely overwrite the firmware.

It requires opening the machine though (and then turning a screw), both as a crude security measure (it's not a drive-by attack if it takes 10 minutes of physical access) and a stability guarantee (without the modification, the system basically can't be stuck, no matter how broken an update might be).

Disclosure: I work on Chrome OS firmware.


I suppose I've been reading third party documentation.

https://wiki.archlinux.org/index.php/Chrome_OS_devices/Chrom...

Along with the link above, it indicates otherwise. Are these websites both wrong?

I did completely overwrite the firmware for my Acer c720 (by removing that write protect screw) but I assumed that was an exception.


The write protect screw (or a similar mechanism) exists and works the same way on every Chromebook.


3K display?? If only the keyboard had IBM style page up/page down keys next to the arrows, at this low price I would buy it right now. All I want is something I can use to SSH, with a great screen and keyboard. Bonus points if it has LTE, a stylus, or is between 12 and 9 inches.

I can't shake some learned habits, so I decide on the keyboard first. The closest thing I could find was the latitude 13, but the price is a bit high. For that much, I might as well get a Lenovo Yoga 13 with the AMOLED display when it becomes available.

But then it gets too big. The best laptop is the one that follows me around. At the moment, my eyes are on the Ideapad Miix 310. Looks like it has a great keyboard, and LTE. But no 3K, and no 16GB option (and no upgrade possible). 16GB of RAM would let me do the crazy things I sometimes do on the side...


Dear industry: I want a 7" tablet that has 2GB to 4GB of RAM and costs $99. I do not care about battery life (it will always be tethered to a charger) or storage (16GB for Android or 32GB for Windows is fine; the absolute minimum that won't realistically choke the tablet to death), and I only partly care about CPU/GPU (the Cherry Trail-era x86 SoCs used in some 7" Windows tablets are fine, and they go for $99 to $199).

The reason I ask for that? $99 usually gets me 1GB + 16GB; I have to go all the way up to $199 or more to get 2GB. Modern OSes just are not really suitable for 1GB. I don't know why these devices even exist; they can't actually be used.


I'd love a rugged engineer's tablet: 8", 8GB RAM, a few USB 3.0 Type-C ports, a few SD card sockets, micro-HDMI, not slim, with a nice battery, GPS, accelerometer, NFC.


You want one of the myriad of super cheap tablets from China: http://techtablets.com/


My Amazon Fire works pretty well flashed to stock Android, certainly worth the $35. Once you get rid of the Amazon bloat it runs swimmingly, even with only 1gb of RAM.


I've read a few reports of sluggishness? Are you a frugal user, or are Amazon apps sucking CPU ticks?


I have an original generation Kindle Fire, with the 512mb of RAM.

Yes, even back then, I forcibly reflashed it with a community build of CyanogenMod (started with 4.0 when Amazon's was based on 2.3, now running 6.0) because of how badly Amazon's default ROM murdered performance, even before I ran anything.


Interesting, the actual model might really be worth $50.


It's so hard for me to suggest people get a chromebook. I've had mine for close to two years and still find myself avoiding using it at all costs.

It's fine for writing on but that's really it.


Really? Do you code for a living? I ask because I am a technical marketer at a startup (i.e., I have to read some code, but mostly my work is done inside Gmail and Google Docs) and could totally work fine on a low-end Chromebook when my Thinkpad was out for a week in repair.


I'm not a developer. The most frustrating part of the Chromebook is having to be connected to the internet to do almost anything. I have desktops everywhere I usually work, but laptops for me are mobile devices. Sometimes/often I don't have WiFi but still need to get work done. Sure, Google Docs kinda works offline, but it's not great.


I wonder what a 3K display will do to what I assume is a rather weak GPU on the Core m. It was only two or three years ago that a Core i5's integrated GPU could barely handle the Retina MacBook.


>> HP offers its Elite USB-C Docking Station ($149), which plugs in to a USB-C port on the PC and enables to connect up two Full HD displays, Gigabit Ethernet as well as multiple USB Type-A devices, such as keyboards or mice.

Good to see high quality USB-C docking stations coming thick and fast. Wonder how much testing they have done on the Macbooks for compatibility...


I'm a bit surprised the dock isn't generating more interest at the stated price point. You'd pay a lot more for the non-Thunderbolt USB 3 Dell dock, so it seems like a good alternative. I have no experience with these docks/adapters - won't they be compatible and just work (with an XPS 13, for example)?


It was unveiled last week, and anandtech cited sources: http://www.engadget.com/2016/04/28/hp-chromebook-13/


Now, if I could only get a normal laptop with those specs. PC laptops are so hard to find with resolutions >= 1080p, let alone a decent SSD.


So is this designed to supplant the Pixel line, or just provide a "premium" Chromebook designed for businesses, rather than developers?


From the title, I thought it was going to have only a single USB-C port like the MacBook; thankfully it has two USB-C ports and a USB-A port.


Looks like really decent hardware for a Linux laptop though.


How good is GNU/Linux support on Chromebooks?


Core M? That's 2015 tech. Using selling points such as 16GB of RAM is weak, as 16GB is so common these days. And why are they talking about '3K' rather than PPI or some other measure of pixel density? It just feels a bit like old news.


16GB RAM is not really common in laptops, especially Chromebooks. The new Macbook had 4GB last year and 8GB this year. For the price it is a pretty good offering if you fit into the (probably small) customer base.


What? My 2011 MBP shipped with 16GB - yes, I chose it over the 8GB option, but it shipped with it! My desktops both have 32GB because... RAM is really cheap, and it really makes a difference when you're running a lot of VMs/containers etc... or, you know, any of the web browsers available in 2016, which are all awful thanks to JavaScript and friends.


Core m. The case is a big difference: big-M Core M was Broadwell, little-m Core m is Skylake. So these processors are on a modern node.


The news is how badly HP is gasping for air.



