
> You can buy it but you can't run it, unless you're fairly wealthy.

Why do I need a colocation service to put a used 1U server from eBay in my house? I'd just plug it in, much like any other PC tower you might run at home.

> Unless you're really wealthy, you have your own home with sufficient power[1] delivery and cooling.

> not sure where you live, but here most residential power connections are below 3 kW.

It's a single used 1U server, not a datacenter... It will plug into your domestic power supply just fine. The total draw will likely be similar to, or even less than, many gaming PC builds out there, and even then only under peak loads.


> Compelled speech, and compelled work, are both disallowed by the US constitution... Apple successfully used this argument several years ago when the FBI tried to demand that they break a phone for an investigation.

I'm not sure this is how the San Bernardino case actually panned out:

"Apple declined to create the software, and a hearing was scheduled for March 22. However, a day before the hearing was supposed to happen, the government obtained a delay, saying it had found a third party able to assist in unlocking the iPhone. On March 28, the government claimed that the FBI had unlocked the iPhone and withdrew its request."

The arguments were never actually tested in court, the whole thing was quietly put away once the FBI found another way to unlock the phone.

> https://en.wikipedia.org/wiki/Apple%E2%80%93FBI_encryption_d...


Yes, in this case they successfully used this argument to delay until the FBI gave up.

If it had gone to court, the argument was considered strong, but of course no one knows until a verdict is reached and appeals are exhausted.


This is a gross simplification of the many factors that led to the FBI dropping the demand against Apple, in my opinion.

But it is everything we know.

The expectation was that FBI would lose in court. But that was not guaranteed, certainly.

The FBI had multiple reasons to abandon the effort, but one was that if legal precedent had been established at that time, for that case, it would have been harder to bypass in future cases.


I expected the FBI to win in court because the FBI had precedent on its side. The judge had asked Apple to provide reasonable technical assistance to access data on the phone, and modifying one line of code fits well within the judge's request.

This simply doesn't work as an excuse much of the time - virtually all the AI tool subscriptions for corporations provide per-user stats on how much each staff member is using the AI assistant. This shouldn't come as a surprise - software tool purveyors need to demonstrate ROI to their customers' management teams, and as always this shows up in the reporting tools.

I've already seen several rounds of Slack messages asking "why aren't you using <insert LLM coding assistant name>?" off the back of this reporting.

These assistants essentially spy on you while you work in many cases, if the subscription is coming from your employer and is not a personal account. For one service, I was able to see full logs of every chat every employee ever had.


The very second someone starts monitoring me like that I'm out. Let them write their own software.

It's not necessarily just monitoring though. I actively ask that question when I see certain keys not being used, to find out whether they're still relevant. Basically I'm taking feedback from some engineers and generalizing it. Obviously in my case we're doing it in good faith, and assuming people will try to get their work done with whatever tools we give them access to. For example, I see Anthropic keys get heavily used in the eng department, but I constantly get requests for OpenAI keys for Zapier connections and the like from business people.

There's definitely a place for this in the market, especially given the whole fiasco around the secret Bartender change of ownership:

> https://www.macrumors.com/2024/06/04/bartender-mac-app-new-o...

I hate when OS utilities like Bartender are closed source to begin with, so a new open-source alternative is ideal.

The worst offender in this realm for me is the widely used FanControl app on Windows, for managing CPU/case fans. The app is completely free of charge, but the developer hides the source.

> https://getfancontrol.com/

The GitHub link for FanControl is just used to upload release zips - no source is available to view. Why, when the tool is freeware? There are no good reasons to obfuscate it.


If the lights have adjustable white balance, you can almost always sync the light color temperature to the time of day with Home Assistant's Flux integration:

https://www.home-assistant.io/integrations/flux/

This works with almost any smart light you can get connected to HA (most smart lights have white balance adjustment), and it's brand agnostic. I have lights from several different vendors working together this way.
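For a rough sense of what the Flux integration automates for you, here's a minimal Python sketch doing the same thing by hand against Home Assistant's REST API. The host, token, entity ID, and the crude schedule are all placeholders/assumptions; the real integration needs no code at all, just a bit of YAML config:

  # Hand-rolled approximation of what Flux does: pick a colour temperature
  # based on the time of day and push it to one light via HA's REST API.
  # HA_URL, TOKEN and LIGHT are placeholders; recent HA versions accept
  # color_temp_kelvin, older ones want color_temp in mireds instead.
  import datetime
  import requests

  HA_URL = "http://homeassistant.local:8123"
  TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"   # created under your HA user profile
  LIGHT = "light.living_room"              # hypothetical entity id

  def target_kelvin(now: datetime.datetime) -> int:
      """Very crude schedule: cool white during the day, warm in the evening."""
      hour = now.hour + now.minute / 60
      if 7 <= hour < 17:
          return 4000
      if 17 <= hour < 23:
          # ramp from 4000 K down to 2200 K across the evening
          return int(4000 - (hour - 17) / 6 * 1800)
      return 2200

  resp = requests.post(
      f"{HA_URL}/api/services/light/turn_on",
      headers={"Authorization": f"Bearer {TOKEN}"},
      json={"entity_id": LIGHT, "color_temp_kelvin": target_kelvin(datetime.datetime.now())},
      timeout=10,
  )
  resp.raise_for_status()

Run something like that every few minutes and you get the basic effect; Flux just does the scheduling and sun-position math for you inside HA.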


So I need a smart light that can be dimmed from the power source. Which ones are those?


Even better, get smart switches that don't use Wi-Fi or IP addresses. I'm personally of the opinion that my home's core features should not rely on IP addresses, working DHCP, or DNS just to turn a light bulb on and off.

Home Assistant works amazingly well with Zigbee devices, which are plentiful and cheap and don't rely on working Wi-Fi/IP infrastructure. When I sell up, my Zigbee switches will keep working just fine as plain old light switches even with all my Home Assistant infra ripped out, leaving no issues for the next buyer.

You can add Zigbee support to pretty much any Home Assistant setup with a 20-buck USB adapter; Home Assistant even makes an official one:

https://www.home-assistant.io/connectzbt1/

They also sell Home Assistant servers with Zigbee radios built in:

https://www.home-assistant.io/yellow/

The light switches are often cheaper than Wi-Fi equivalents too. Wi-Fi bulbs should really only be considered by renters IMO - people who can't easily replace wall switches or similar.


Except IP works far better than Zigbee's alleged mesh networking and all the other home-network technologies; somehow home automation is a special snowflake that can't use the same network technology everybody else uses.


There are a few reasons why Wi-Fi is not my first choice:

* I don't trust any company to use my Wi-Fi and not attempt to access the broader internet. A Zigbee or Z-Wave device isn't going to be able to stealthily update itself in anti-user ways, nor is it going to be hijacked into serving as part of a Bitcoin botnet.

* There are way too many devices, which can cause issues if they're all using Wi-Fi at the same time. Smart homes take a router that would normally be dealing with 2-4 phones and 2-4 laptops and add N bulbs, M switches, P contact sensors, Q motion sensors, and assorted random sensors. Not a chance am I hooking that much up to my Wi-Fi.

Z-Wave LR has worked very well for me—no mesh to worry about, just a controller and devices. The only downside is that it's not as broadly supported as Zigbee or Wi-Fi.


Put your IoT on a separate wifi SSID and firewall it.

My devices can't reach the internet at all, but I have easy access to them the other way.


It's easier and has better guarantees to just... not have them on a Wi-Fi at all.


That's not an advantage of switches though. My smart bulbs are Zigbee, too.


The main drawback of keeping the switch "dumb" and only using smart bulbs is that someone can cut power to the bulb at the wall, which is why I and the parent commenter focus on automating the switches. If someone turns a dumb wall switch off, you can't turn the "smart" bulb on with home automation regardless of what tech is inside it. Automating the switch generally gives the best return on making the most dependable system, as you will always be able to get the light back on. Again, I only recommend going smart-bulb-only if you are a renter or similar and can't mess with your switches.

Zigbee access to the bulb is great for stuff like changing white balance, though. In my own home I have the bulbs and the switch on Zigbee so I can do this, but power on/off is solely the preserve of the automated switch.


What Zigbee switch do you recommend? I was considering the Sonoff ZBMINI.


I've had great results with the Aqara Zigbee stuff - almost all of it works fine connecting to Home Assistant via generic Zigbee USB adapters, and it can be found online pretty cheaply. I have >50 of their switches and sensors at the moment.


Aqara is a bit of a mixed bag. A lot of their switches are not Zigbee certified and don't conform to the standard. Specifically, they won't bind directly with devices from other manufacturers.

This might not matter if you're pushing everything through a hub like HA, but if you want to connect directly with other devices and remotes then it likely won't work.


I also ran a dual Celeron system overclocked to 450 MHz - it was great value in 1999. ABIT even launched a motherboard that let you run dual Celerons without modifying the processors, the legendary BP6:

https://en.wikipedia.org/wiki/ABIT_BP6

This was the first board to let you use unmodified Celerons; the "hack" to make dual CPUs work with those chips was done at the motherboard level, with no CPU pin modifications needed.


The real problem with this setup was that a vanilla Pentium 3 would run circles around the dual Celerons. I had my Celerons clocked to something ridiculous at one point, like 600 MHz, and still could not beat the Pentium.


You are forgetting the massive price difference though. For sure a P3 was great if you had an unlimited budget, but a quick look at pricing sheets from September 1999 shows a 600 MHz P3 at ~$650.

The 300 MHz Celerons, easily overclockable to 450/500 MHz, were only ~$150 each. These prices are in 1999 dollars too; I haven't adjusted for inflation.

It was the value proposition, not the outright performance, that made dual Celeron builds attractive, especially in an age when we had to upgrade far more often than we do today to keep up with the latest trends.

In 1999 I vividly remember not being able to afford a P3 build, which was largely why I ended up with the BP6. The P3 also had significant supply issues throughout its lifespan, which didn't help pricing at retail either.


I recall some kind of Intel trade show event where attendees got to buy a P3-500 and an SE440BX motherboard for $250.

It was awesome and was my main computer for years.


Funny to see movie people going through the same pain PC gamers did (lamenting the loss of "big box" retail games - the reasoning is much the same), especially the complaints around the loss of things like booklets etc.


So many of these things are purely nostalgic.

I remember building a computer 8 years ago where, for the first time, I didn't need an optical drive. Not needing to dig around for 3-4 separate disks or worry if they were scratched was very handy.

Yes the ownership piece is one thing, but I have yet to play a game in over a decade that was eventually taken away from me.

Why do I want the booklets when there are other incredible sources of game info online, from YouTube tutorials through to subreddits and wiki articles?


Weirdly, I'm more in favor of it with Steam - I guess there's not a really great equivalent for movies and music? I mean, Bandcamp is pretty close to GOG, but it's still a bit of a hassle to manage your library, and it's easier to keep CDs in the car.


Steam works great while the benevolent dictator (Gabe) is still alive and in direct control.

What happens to the company and its services once he dies is extremely concerning to me, given the depth of my own investment now in game licences on the platform. Things may not be so rosy if/when Steam is sold to a publicly traded parent and has to do boring things like deliver quarterly growth for shareholders.

Would a Steam/Valve owned and run by Microsoft be as good? It might not be Microsoft in the end of course, but that question is going to come up a lot sooner than a lot of Steam fans probably realise, especially as Gabe ages out.

Big box games generally came with very few of these types of concerns.


Right exactly. Any discussion about this topic that doesn’t separate LFP (lithium iron phosphate) based batteries from the wider discussion about lithium ion is missing one of the biggest changes to battery tech in recent years. Not all lithium based batteries share the same fire risks - there are enormous differences now.

Increasingly, EVs and especially backup batteries use LFP tech - you can charge it to 100% without harming the battery, unlike previous lithium-ion chemistries, and they don't really catch fire.

This paper even acknowledges LFP is significantly safer, mentioning it once, but doesn't dive into the significant improvements. Again, many EVs (including brands like Tesla) already use LFP packs, and the usage is only increasing. Tesla's generation 3 Powerwalls (home backup batteries) are LFP too; it's really taken off for home power storage for very obvious reasons - they don't really catch fire, and you don't need to worry about charging to 100% harming the battery over time.


For sure, it's been a sweet spot for a very long time for budget-conscious gamers looking for the best balance of price and frame rates, but 1440p-optimized parts are nothing new. Both Nvidia and AMD make parts that target 1440p display users too, and have done for years. You can even argue previous Intel parts were tailored for 1080p/1440p use, given their comparative performance deficit at 4K.

Assuming they retail at the prices Intel is suggesting in the press releases, you maybe save 40-50 bucks here over a roughly equivalent Nvidia 4060.

I would also argue, like others here, that with tech like frame generation and DLSS, even the cheapest discrete Nvidia 40xx parts are arguably 1440p optimized now; it doesn't even need to be said in their marketing materials. I'm not as familiar with AMD's range right now, but I suspect virtually every discrete graphics card they sell is "2K optimized" by the standard Intel used here, and also doesn't really warrant explicit mention.


I'm baffled that PC gamers have decided that 1440p is the endgame for graphics. When I look at a 27-inch 1440p display, I see pixel edges everywhere. It's right at the edge of losing the visibility of individual pixels, since I can't perceive them at 27-inch 2160p, but not quite there yet for desktop distances.

Time marches on, and I become ever more separated from gaming PC enthusiasts.


Gaming at 2160p is just too expensive still, imo. You gotta pay more for your monitor, GPU and PSU. Then if you want side monitors that match in resolution, you're paying more for those as well.

You say PC gamers at the start of your comment and gaming PC enthusiasts at the end. These groups are not the same and I'd say the latter is largely doing ultrawide, 4k monitor or even 4k TV.

According to Steam, 56% are on 1080p, 20% on 1440p, and 4% on 2160p.

So gamers as a whole are still settled on 1080p, actually. Not everyone is rich.


The major drawback of PC gaming at 4K that I never see mentioned is how much heat the panels generate. Many of them generate so much heat that they rely on active cooling! I bought a pair of high-refresh 4K displays, and combined with the PC, they raised my room to an uncomfortable temperature. I returned them for other reasons (hard to justify not returning them when I got laid off a week after purchasing them), but I've since made a note of the wattage when scouting monitors.


>hard to justify not returning them when I got laid off a week after purchasing them

Ouch, had something similar happen to me before when I bought a VR headset and had to return it. Wishing you the best on your job search!


That was earlier this year. I found a new job with a pay raise so it turned out alright. Still miss my old team though.. we've been scattered like straws in the wind.


I'm still using a 50" 1080p (plasma!) television in my living room. It's close to 15 years old now. I've seen newer and bigger TVs many times at my friends' houses, but it's just not better enough that I can be bothered to upgrade.


Doesn't plasma have deep blacks and color reproduction similar to OLED? They're still very good displays, and being 15 years old means it probably pre-dates the SmartTV era.


Yes exactly, the color is really good and it's just a stupid monitor that has an old basic Chromecast and a Nintendo Switch plugged in.

It does have HDMI-CEC, so I haven't even used the remote control in several years.


Classic Plasma TVs are no joke. I’ve got a 720p Plasma TV that still gets the job done.

If you’re ok with the resolution, then the only downside is significant power consumption and lack of HDR support.


I recently upgraded my main monitor from 1440p at 144 Hz to 4K at 144 Hz (with lots of caveats) and I agree with your assessment. If I had not made significant compromises, it would have cost at least $500 to get a decent monitor, which most people are not willing to spend.

Even with this monitor, I'm barely able to run it with my (expensive, though older) graphics card, and the screen flashes alarmingly whenever I change any settings. It's stable, but this is not a simple plug-and-play configuration (mine requires two DP cables and fiddling with the menu + NVIDIA control panel).


Why do you need two DP cables? Is there not enough bandwidth in a single one? I use a 4k@60 display, which is the maximum my cheap Anker USB-C Hub can manage.


I'm not sure, but there's an in-depth exploration of the monitor here: https://tftcentral.co.uk/reviews/acer_nitro_xv273k.htm

Reddit also seems to have some people who have managed to get 144 with FreeSync, but I've only managed 120.

Funnily enough while I was typing this Netflix caused both my monitors to blackscreen (some sort of NVIDIA reset I think) and then come back. It's not totally stable!


This is likely a cable issue. Certain cable types can't handle 4K. I previously had to switch from DisplayPort to HDMI with a properly rated cable to get past this.

It works up until too many pixels change, basically.


Had the same issue at 4K 60 fps; it mostly worked, but the screen flashed black from time to time. I used the thickest cable I had lying around and it has worked fine since.


If you're on an Nvidia 4000 series card, DisplayPort is limited to 1.4, which is ~26 Gbit/s.

https://linustechtips.com/topic/729232-guide-to-display-cabl... has a calculator for bandwidth; 4K@144 with HDR is ~40 Gbit/s. You can do better with compression, but I find Nvidia cards have issues with compression enabled.
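For a rough sanity check on those numbers, here's a small Python sketch; the 1.1 blanking overhead factor is my assumption (reduced-blanking timings), so treat the output as ballpark only:

  # Uncompressed video bandwidth vs. what DP 1.4 can carry.
  def video_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.1):
      """Approximate uncompressed bandwidth in Gbit/s (blanking factor assumed)."""
      return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

  dp14_payload = 32.4 * 8 / 10   # HBR3 raw 32.4 Gbit/s minus 8b/10b overhead ~= 25.9

  for bpp, label in [(30, "4K@144 10-bit HDR"), (24, "4K@144 8-bit")]:
      need = video_gbps(3840, 2160, 144, bpp)
      verdict = "fits" if need <= dp14_payload else "needs DSC or a lower mode"
      print(f"{label}: ~{need:.1f} Gbit/s vs DP 1.4 ~{dp14_payload:.1f} Gbit/s -> {verdict}")

That lands around 39 Gbit/s for 10-bit HDR, which lines up with the ~40 figure above and explains why these panels fall back to DSC, chroma subsampling, or (on some older models) two DP cables.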


Interesting. I've been running my 4K monitor at 240Hz with HDR enabled for months and haven't had any issues with Display Stream Compression on my 4080.


For me it's a small issue with full screen (including borderless iirc) games causing the display to black out for a few seconds.

I don't think it's an issue until you notice it. I only noticed because I toggle HDR for some games, and at 1440p 240 Hz the difference is just enough to not need DSC.


I don’t think that’s true anymore. I routinely find 4K/27” monitors for under $100 on Craigslist, and a 3080-equivalent is still good enough to play most games on med-high settings at 4K and ~90Hz, especially if DLSS is available.


Your hypothetical person has a 3080 but needs to crawl Craigslist for a sub-$100 monitor? I guess those people exist, but I don't know why you'd bother with a 3080 only to then buy a low-refresh-rate, high-input-latency, probably TN, low-color-accuracy Craigslist cast-off.

Could just get a 3060 and a nice 1440p monitor.


3080-equivalent performance can be had for a few hundred bucks these days, no?


Not rich. Well within reach for Americans with disposable income. Mid-range 16" MacBook Pros are in the same price ballpark as 4K gaming rigs. Or, put another way, it costs less than a vacation for two to a popular destination.


> You say PC gamers at the start of your comment and gaming PC enthusiasts at the end. These groups are not the same

Prove to me those aren't synonyms.


Prove to me they are.


I used to be in the '4K or bust' camp, but then I realized that I needed 1.5x scaling on a 27" display to have my UI at a comfy size. That put me right back at 1440p screen real estate, and I had to deal with fractional scaling issues.

Instead, I bought a good 27" 1440p monitor, and you know what? I am not the discerning connoisseur of pixels that I thought I was. Honestly, it's fine.

I will hold out with this setup until I can get an 8K 144 Hz monitor and a GPU to drive it for a reasonable price. I expect that will take another decade or so.


I have a 4K 43" TV on my desk and it is about perfect for me for desktop use without scaling. For gaming, I tend to turn it down to 1080p because I like frames and don't want to pay up.

At 4K, it's like having 4 21" 1080p monitors. Haven't maximized or minimized a window in years. The sprawl is real.


macOS and Windows have very good fractional scaling now. Linux... still no, but it's better than the X days.


I find the scaling situation with KDE is better with the Xorg X11 server than it is with Wayland. Things like Zoom will properly be scaled for me with the former.


Are you using fractional scaling, or 2x?


2x


It's true but I don't run into this issue often since most games and Windows will offer UI/Menu scaling without changing individual windows or the game itself.


> That put me right back at 1440p screen real estate and you had to deal with fractional scaling issues.

Get a 5K 27".

No fractional scaling, same real estate, much better picture.


I think it's less that gamers have decided it's the "endgame" and more that current-gen games at good framerates at 4K require significantly more money than 1440p does. At least to my eyes, running at native 1440p on a 1440p monitor looks much better than running an internal resolution of 1440p upscaled to 4K, even with DLSS/FSR - so upgrading piecemeal isn't really a desirable option.

Most people don't have enough disposable income to make spending that extra amount a reasonable tradeoff (and continuing to spend on upgrades to keep up with their monitor on new games).


This is a trade-off with frame rates and rendering quality. When having to choose, most gamers prefer a higher frame rate and rendering quality. With 4K, that becomes very expensive, if not impossible. 4K is 2.25 times the pixels of 1440p, which for example means you can get roughly double the frame rate at 1440p using the same processing power and bandwidth (see the quick check below).

In other words, the current tech just isn’t quite there yet, or not cheap enough.
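The 2.25x figure checks out; a quick back-of-the-envelope in Python:

  # Pixel counts behind the 2.25x claim above.
  pixels_4k = 3840 * 2160          # 8,294,400
  pixels_1440p = 2560 * 1440       # 3,686,400
  print(pixels_4k / pixels_1440p)  # 2.25 -> the same GPU pushes ~2.25x fewer pixels at 1440p

Frame rate doesn't scale perfectly linearly with pixel count (CPU-limited scenes, fixed per-frame costs), so "roughly double" is the honest version of that claim.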


Arguably 1440p is the sweet spot for gaming, but I love 4K monitors for the extra text sharpness. Fortunately, DLSS and FSR upscaling are pretty good these days. At 4K, quality-mode upscaling gives you a native render resolution of about 1440p, with image quality a little better and performance a little worse than native 1440p.

It’s a great way to have my cake and eat it too.


I don't think it's seen as the end game; it's that if you want 120 fps (or 144, 165, or 240) without turning down your graphics settings, you're talking $1000+ GPUs, plus a huge case and a couple hundred watts more from your power supply.

1440p hits a popular balance where it’s more pixels than 1080p but not so absurdly expensive or power hungry.

Eventually 4K might be reasonably affordable, but we’ll settle at 1440p for a while in the meantime like we did at 1080p (which is still plenty popular too).


You incorrectly presume that all gamers care about things such as "pixel edges". I think 1080p is fine. Game mechanics always trump fidelity.


That's more a function of high-end Nvidia gaming card prices and power consumption. PC gaming at large isn't about chasing high-end graphics anyway; the Steam Deck falls under that umbrella, and so does a vast amount of multiplayer gaming that might have other priorities such as affordability or low latency/very high FPS.


I don't see anybody thinking 1440p is "endgame," as opposed to a pretty nice compromise at the moment.


It's a nice compromise for semi-competitive play. At 4K it'd be very expensive and most likely finicky to maintain high FPS.

Tbh now that I think about it I only really need resolution for general usage. For gaming I'm running everything but textures on low with min or max FOV depending on the game so it's not exactly aesthetic anyway. I more so need physical screen size so the heads are physically larger without shoving my face in it and refresh rate.


It's not the endgame, but most would rather have more FPS for a high refresh rate at higher graphics settings than a couple more pixels you won't notice unless you look for them, combined with low FPS and/or low graphics settings.


I find dual 24-inch 1440p a great compromise. Higher pixel density, a decent amount of screen real estate, and it's nice to have an auxiliary monitor when gaming.

I run the second monitor off the IGPU so it doesn't even tax the main GPU.


4k gaming (2160p) is like watching 8k video on your TV.

It's doable, the tech is there. But the cost is WAY too high compared to what you get from it in the end.


How far away are you sitting from that monitor? Mine is about 3-4 feet away from my face and I have not had the same experience.


if you can see the pixels on a 27 inch 1440p display, you're just sitting too close to the screen lol


I don't directly see the pixels per se, like I do on 1080p at 27 inches at desktop distances. But I see harsh edges on corners, and text is not flawless like it is at 2160p.

Like I said, it's on the cusp of invisible pixels.


Gamers often use antialiasing settings to smooth out harsh edges, whereas an inconsistent frame rate will literally cost you a game victory in many fast-action games. Many esports professionals use low graphics settings for this reason.

I've not tried but I've heard that a butter-smooth 90, 120, or 300 FPS frame rate (that is also synchronized with the display) is really wonderful in many such games, and once you experience that you can't go back. On less powerful systems it then requires making a tradeoff with rendering quality and resolution.


Nvidia markets the 4060 as a 1080p card. Its design makes it worse at 1440p than past X060 cards too. Intel has XeSS to compete with DLSS and is reportedly coming out with its own frame-gen competitor. $40-50 is a decent saving in the budget market, especially if Intel's claims are to be believed and it's actually faster than the 4060.


The 4060 is an "excellent" Nvidia product where they managed to release something slower than the cheaper previous-gen 3060.



