Apart from programming, one of the motivations for getting the 8K display is to look at lidar point clouds. For example, the desktop background in my post is a lidar map of Bernal Hill in San Francisco, which I've downsampled here to only 13006 x 7991 px for your convenience [1].
Admittedly, when I first bought it, I didn't realize there would be so many random issues, as manufacturers all advertised their gear as "8K Ready" even in 2021. As I incrementally fixed the problems, I decided to document my journey in this blog post.
btw I posted this in the past but it got caught by the spam filter and disappeared [2], not sure how to appeal that when it happens. Thanks ingve for posting it again!
I had the Dell 8K monitor you mentioned. The picture quality was great, but it died after a few years, not long after the warranty expired (a gut punch at that purchase price), and they said too bad, so sad... OK, that's fine, but I will never buy another Dell product again. It was released too early to have proper DisplayPort support, and I had to use a custom nvidia-driver X11 config to make it mostly work as two monitors. And there is basically no way to use that kind of DPI without scaling.
I replaced it with an LG 43UN700, a 43" 4K display that I use unscaled, and although the LCD panel is vastly inferior, I love the thing, especially at the price point (under $700). I hope manufacturers continue to support this niche of large, singular, flat displays, because they are fantastic for coding and data viewing/visualization, and can pinch-hit at content consumption as your article states, although this one would be no good for gaming. And getting a "monitor" or "professional display" firmware load means a lot fewer problems than a smart TV load.
I had a similar experience with Dell after they wanted the price of a new laptop for a replacement laptop battery. This was for the Dell Studio back when battery packs were made to be swappable by simply sliding a latch.
After that phone call to customer support, I made a similar vow to never buy another Dell product. These days, I use a Framework laptop.
If Sir is buying his lithium batteries and/or power transformers from the likes of eBay, Alibaba and Amazon, then Sir may wish to check his fire insurance is up to date.
I have bought many third-party rechargeable batteries from those sites over the years. Yes, slightly lower charge capacity compared to the original, but no fires. And, yes, I know my sample size is small!
I also had similarly good experiences buying batteries on AliExpress. The issue with those typically isn't intrinsic quality, as the batteries are good most of the time, but a lack of quality control. Bad batteries will reach the market, and this is especially dangerous with packs that have many cells, like e-bike packs.
I did in fact buy a knock-off battery from eBay, but it kept its charge for hilariously little time. Had to run it off mains power permanently (ran it as a little server for a while).
Don't know your exact timing, but I've run basically on Dell Latitude laptops for the past two decades. Since it's just for travel and not my primary workhorse, I buy used corporate ones for a pittance (circa $300 for models worth $1,500 a few years earlier) and swap the battery myself for another original OEM one; they cost less than $100 from the original manufacturer. It's just 2-3 Phillips screws and 1 cable, anybody can do it. They last just as long as advertised on new ones and don't degrade much even after a few years.
Batteries (and e.g. chargers) are one of the things that are utterly idiotic to shop around for on Chinese portals. You literally always get what you pay for (or worse) and can't punch above this threshold.
This was the mid 2010s and the laptop has long since bitten the dust. IIRC this was a Dell Studio 15, and I recall checking eBay for new old stock with no luck, but it doesn't surprise me that the Dell Latitudes have lots of stock floating around eBay.
FWIW, it was the same (even at the enterprise level).
We had a commodity (local cloud) computing Dell infra in the mid 2010s and were constantly replacing/returning “simple stuff” (fans, support flanges, memory, NICs).
“Dude, you’re gettin’ a Dell” became: nope, never again.
I feel like there are no good-quality hardware companies nowadays.
Dell: the land of motherboard dying and dog shit trackpads.
Asus: dead soldered RAM.
Most BIOSes: too long to boot. It's fucking 2024, what is your BIOS doing that it needs more than 2s to boot? It took the same time to boot 30 years ago.
Every mouse: double-click problems due to wrong use of the actuators.
And every hardware company has to try to cram some badly designed software and require you to create an account.
Your trackpad comment brought back a memory of 6 of us in a conference room.
We all had the same OS (NT) to the same patch level, same trackpad config, and same model of Dell laptop and every _single_ trackpad felt different. They weren't strictly "defective", but just wildly disparate physical feels and responsiveness.
I will give shout-outs to: 4th gen Kindles (has physical buttons and lasts forever), first gen iPhone SE, and Microsoft Mobile Mouse 3600.
Why does it take that long to POST? I've had multiple Ryzen 300-series motherboards, and none of them take anywhere near that long to boot, outside of using something like a server-grade HBA that has its own boot step.
I have no idea, but it's a known issue, memory training maybe? It's a gaming PC so nothing special going on, ROG HERO motherboard, 32GB DDR4 (4x8GB), GTX 1080Ti.
I haven't used it much in recent years. I built it for gaming but had kids a couple of years later; now I game on whatever is convenient in the small bursts I get, which is also the reason I haven't bothered upgrading it.
I would tepidly recommend Lenovo; they support the firmware for a long time and most things work. Warranty is what you decide to buy. Designs tend to be pretty serviceable, but it varies across models and over the years.
I stupidly updated the firmware on my ThinkPad 14 running Linux, and that removed the perfectly working S3 sleep and gave me a non-working, ridiculous S0ix instead.
You may want to look at the Samsung 5K monitor. It can often be had for $700. The sharpness of text is beautiful, especially if you're using a mac since it's optimized at 218ppi to avoid scaling. But, it might be smaller than you want. Apple also makes one that is nearly identical, except for the price.
PS - I have seen Dell go downhill as well. I returned the last Dell laptop I bought. My wife was sitting next to me on the couch and her macbook had full wifi bars while the dell had one bar. I did some research and they were using a pretty cheap wifi controller and maybe also had poor antenna design. I ordered a ThinkPad for the same price and it was great.
Which Samsung are you talking about? We got some new screens at work, Samsung S9 something or other. 27", 5K, thunderbolt 4. As you say, the text is very sharp, and colors seem fine enough. But that's about it, and I would not recommend them at all.
The worst issue is that the viewing angles are ridiculously bad. I'm sitting at arm's length, and the borders are very dark and somewhat blurry. They're of course OK if I move and look at them straight on, but my 32", fairly old LG doesn't have this problem.
Another pain point is that it cuts off the power supply and the USB peripherals plugged in when it goes to sleep. I couldn't figure out any way of disabling this behavior. But if you leave your PC running and expect to connect to it over a USB network adaptor or similar, you're gonna have a bad time.
Yes, that's the one (Samsung only has one 5K monitor). Most reviews I've seen have been pretty positive about it, but it's good to hear a dissenting opinion as well.
I believe the viewing angle problem you're talking about is due to the anti-glare finish. It's a trade off for sure (one that some people would not want). I assume that's why Apple offers their consumer 5K monitor with or without that finish.
Apple also has the Pro Display XDR, though that one is 6K. LG also has one older 5K, and those are the only 5K monitors currently on the market.
It's true that the anti-glare works OK. By that, I mean that I've never thought about it, and now that you mention it, I realize it's a good thing since I never felt the need to complain (I don't usually hold back). The screen is also very sharp and doesn't exhibit the weird texture some anti-glare coatings used to have.
However, in that particular office, there are no strong sources of light that would shine directly on the screens, so it's hard to say how good it actually is, especially when comparing to other models. The screen can also get pretty bright, so it should be able to handle most lighting situations in an office.
I type this on a Dell U3223QE, on a black background, with two lamps right behind me. The lamps aren't very bright, but the room is fairly dark (it's still night here). I can see the glare if I pay attention to it (didn't notice it before reading your comment). This is a 32" screen, sitting at roughly the same distance as the Samsung, yet it doesn't exhibit the viewing angle issue at all.
I do know that having a brightish window behind me with this screen requires upping the brightness, or the glare would be a pain. Never tried the Samsung in that configuration.
I've had a similar experience with Dell. I have exclusively used new and second-hand Dell laptops (Inspiron & Latitude) for the past fifteen years with no problems. Purchased an XPS 15 directly from Dell five months ago and the battery charging circuitry has fried itself. The support ticket has been open for 40+ days awaiting parts...
Can't answer for them, but: lots of us are older and need glasses. To really benefit from your preferred resolution would mean tiny fonts that give me eyestrain. It makes sense for a phone or tablet held close up; for a monitor a meter away it mainly just increases the expense at every level (including video ram and bandwidth). OTOH more area is worth spending more on.
While I would prefer to have a large and HiDPI display in the future, unscaled 4k was more economical and has fringe benefits of not needing special setup/handling. I lost $5-6k with my failed Dell and am hesitant to spend a lot again since it was supposed to be a decade purchase.
Thinking back, the only other monitor I've ever lost was a Dell as well. In 30 years of CRTs and CCFL LCDs I never had any issues with other brands :(
I never owned a Dell screen, but I once had a Dell laptop and it was built like a tank. My brother had a 27" LG monitor from back when 27" was the biggest and best you could get, wasn't even 4k back then. It just died one day, probably the back light. I had a 24" CCFL monitor that actually never died, just got dimmer and dimmer every year, after about 5 years it was about half as bright as it was new.
Today I mostly use my 16" MacBook, which is quite close to being 4K. I really enjoy the HiDPI and the 120Hz refresh rate; it makes it hard to use an external monitor since you can rarely get one that is both HiDPI and high refresh rate.
That's surprising given that Dell usually offers a very good warranty on their monitors, at least to consumers. Was this a business (B2B) purchase perhaps?
Not my experience, but maybe for some. The big problem is that quality has gone way downhill in the past 10+ years, and the warranty periods are ridiculously short. TVs and monitors are all built (and warrantied) now like they should be replaced every 3-5 years.
In the early 4k era, whenever I saw a TV used as a monitor, the eye strain was high. It was too bright, too contrasty, and generally, the picture was not great for using for things like programming. In addition, many TVs would do not-so-great things to the picture, the worst being digitally sharpening the image (which resulted in e.g. a halo effect around small text).
This might not bother some people, but it bothers me a lot.
How are you finding the display compares to a real monitor? How do I buy TVs which I know won't do this sort of thing?
Many (most?) current TVs have either a game or PC mode which can be set per HDMI input to disable these "improvements".
I think this is primarily driven by console gamers, to the benefit of PC users. Our needs align here.
If you check rtings.com they usually evaluate how good the TVs are as a monitor.
You might still have issues with local dimming etc. But that is the price of cheap. Better models work really well today.
I am using a really cheapo LG 43" 4K as a monitor. Properly adjusted, it is usable. Would I like better? Yes. But it is worth the trade-off. And there are only a few options for a "proper" 4K monitor at 43". I find that a little strange as it hits the sweet spot around 110ppi. I used to use dual monitors, but I much prefer a (I know: comically) large screen.
Only real annoyance I have is that it does not turn off automatically like a real monitor using DPMS. This means I have to turn it on using a button. It will turn off after 15 min if there is no signal, like in the olden days.
Fortunately, modern TVs can disable the sharpening and contrast/saturation enhancements. Since I do a lot of photography and image processing, I am also extremely sensitive to oversharpened halos, so I was a bit worried about that at first --- but fortunately, that is 100% nonexistent once I applied the appropriate settings on my Samsung QN800A. See pics: [1] [2]
I also detest the mini-LED HDR that they have going on, which can cause bright things to glow, so I disabled that. Unfortunately my QN800A still has a bit of "full screen HDR", namely, that the whole screen may uniformly dim if it detects that the scene is dark. This means that sometimes when you have a black screen with a single cursor on it, it gets dark and it becomes hard to see the cursor. This doesn't affect normal usage, though, when the screen is at a constant brightness.
On an LG TV you can disable this by buying a 'service remote' from Amazon and accessing a special menu (look for TPC or GSR settings). I don't know about Samsung though.
I am using a 43 inch 4K monitor, so I am all in on big screen real estate. But I find that even with a quarter of your screen area, I struggle to read the corners of the screen, the bottom is often obstructed by whatever is lying on my desk, and I had to make the mouse cursor bigger as I kept losing it. I doubt that an even bigger screen would be practical. I do have two 43" monitors side by side, but the other one is more like a secondary screen for playing movies or "storing windows"; it's too far from the eye to be useful as a primary monitor for reading and writing.
32" being (IMHO) too small and 43" too large, I have invested in a rare (and relatively expensive) 38" 4K monitor (Asus/ROG Swift PG38UQ, ~1000€ when I bought it, hasn't gone down much since), and until now I can only say good things about it. It's big enough to use without scaling (except for a few websites with tiny font size), but small enough so you can have a reasonable distance between the bottom and the desk and still see all four corners without craning your head around too much. It has a fixed foot (only the vertical angle is adjustable), and I originally thought I would have to buy an extra fully adjustable monitor stand, but so far I'm happy with the "default" settings. I'm not getting another one because of space limitations (and spouse tolerance issues), but compared to the two WUXGA monitors I had before, it's already almost four times as much screen space, so that should be enough for the foreseeable future.
Spouse tolerance issues are a significant factor - I have 2 x 32" and my spouse got used to it only after grudging about it for about a year :) If I brought a 43" one, I'm afraid a new apartment with a separate work room would have to come too.
I had a similar experience using a 43" 4K TV as my monitor, it was an OLED so the picture was absolutely beautiful but I'd end up only using the 32" in the middle of the display. I'm now using a 32" 4K display on my desk which is about the sweet spot for me, lots of real estate, and I can see all of it.
I also have a 43" 4K monitor, and I find myself being in the same position as you. The left/right edges are difficult to see. I don't have the issue with the mouse, but I also doubt whether a larger screen would be useful to me. As it is there's a corner that gets unused because it's just out of "eye shot" if that's a phrase. It is now, I guess :D
I use glasses (myopia) and can kind of tolerate the edges of my 32" 4k monitor, but I can't fathom craning my neck all the way up to the edges of a 55"+ display. Not to mention font sizes.
I have fairly bad eyesight with both myopia and astigmatism (-5 sph, -2 cyl) and I wear glasses. I got glasses with 1.71 index lenses, which I greatly prefer over the more common 1.74 index lenses due to the higher Abbe number, resulting in less chromatic aberration.
Anyway, I use browsers at 150% scaling usually, although the text is finer on my terminals. I don't use any scaling for UI elements and terminals. Using the i3 tiling window manager, I put more commonly used terminals on the bottom half of the screen since I find that the top half does require more neck craning.
FWIW there are lenses that are high index while still having a higher Abbe number, but they're expensive and pretty specific materials. Interesting that 1.74 are more common where you are, where I am lower index polycarb are the standard (sadly)
I had a 55" TV as my main display in 2022. Had it about a foot away from my face. It takes a few days, but your brain and body get used to the size.
I just bought a 39" ultrawide and for the first few days I thought "oh dear, I have to keep turning to see the whole thing," but I've not even thought about it for a couple of weeks now, so I guess I'm acclimated.
I have been using a 32" monitor for the last 10 years. I have found that I am using mostly the center of the monitor. The peripheral edges remain unused.
If I sit far from the monitor, then the FOV could be reduced, but then I have to increase the font size defeating the very purpose of maximizing screen real estate.
This is pretty much what I concluded as well after using my 43" 4K LG monitor for about 3 years. Lately I've been trying out my wife's 27" Apple Studio Display. It's smaller but the PPI is amazing...
You don't maximize windows except to watch videos at that size. It's more like having multiple monitors with fluid borders. You focus as needed, leaving the rest in your peripheral vision. That said I did miss maximizing windows to focus on tasks.
I use a combination of Aquasnap's magnetic border feature with MS Power Toys hotkeys and it has been a treat. Still room for improvement tho', esp. if I can force specific browser tabs into particular windows based on purpose.
Nice to see other people doing the same thing I do, albeit with a 4k OLED instead. I am waiting for an 8k OLED at an affordable price but it seems I will have to continue waiting.
What brand and model of desk do you have? I have a 48" TV but I sit rather close so it probably takes up the same field of view as your 65".
As to your last paragraph, if you email hn@ycombinator.com and explain the situation, they'll sort you out and sometimes put you into a second chance pool, as it's called.
I wish deep desks were more common! Modern ultrawide curved monitors sit way too close for comfort for me, because their legs have to be angled further back for center of gravity. Custom desks end up being so expensive.
That's where a desk mount monitor arm comes in. Even a high-end model capable of holding those 49" 32:9 monsters will likely be significantly cheaper than a custom desk.
I'm using a nice sheet of 4x8 finished plywood from the hardware store. I trimmed the depth down a bit, but not much. I put some edge banding on it and stuck it on top of a Flexispot or whatever other 4-legged desk frame you want to use.
It's a dense hardwood, near the top of those attributes on wood scales.
> Your suggestion seems oddly specific.
Hardwoods make great table tops, I've always had jarrah workbenches and general desktops and used other woods for 'fancy' tables .. but then I'm in W.Australia and used to recover treefall for slabbing in sheds and using in Brady Drums etc. (I knew Chris Brady back in the day https://www.youtube.com/watch?v=55SXxWz0Vpg)
> Is this available outside of Australia?
Significant tonnages of it were blocked, shipped to England as ballast and used to pave the streets - as a consequence quantities are still kicking about the UK after being recovered and repurposed.
How did you get it in a custom dimension? I'm almost tempted to just put two of my current desk back to back to make it deeper, would probably be much cheaper than 2k, but then again, they're not standing desks.
In case you're wondering whether this works on a Mac, like I did, I found this source[0]. In short, you need an M2 Pro or better, and may need to edit a plist file to get it to work.
A tad surprised that curvature isn't discussed? With such a massive screen, the distance from your eye to the middle of the screen, and eye to the corners, are very different - unless you sit far away. Your eyes thus need to change focus all the time. That's AFAIK why those ultra wide screens are curved - and I find that the more curve they have (smaller radius), the better it is. With such a massive screen, I guess it would be best if it was part of a sphere! (Curved both ways)
I recently acquired a 43" 4K monitor for programming - a very boring Philips monitor, used at 100% scale. I hated it at first, but after a month I loved it.
A 2160p actual 'workspace' resolution at this distance (2 feet?) and size (43") seems close to a practical limit for typical use, I thought; even with this measly 43" it still requires a little bit of occasional head movement to see the top right corner. I noticed a tendency to sit slightly to the left of centre on this monitor, to avoid distortion and maintain clarity with what I'm focusing on (e.g. code/windows, not reference materials). Because of this I suspect that at this distance a 43" with a slight curve would be optimal, at least for me.
What I wanted to ask you:
- What is your 'workspace' resolution? Is it something like 6K? I'm guessing your scaling is either 125% or 150%? Your PPI should be around 135, mine 102 (quick back-of-envelope sketch after these questions).
- Are you actually sitting perfectly centred? I was wondering this because I keep noticing I tend to gradually shift my keyboard to the left over a day. Maybe this is years of 1440p + side portrait monitor use, I'm not sure, but eventually I accepted that I prefer slightly to the left (odd, because my side portrait was on the left...)
- Do you think a curved monitor at this size/distance would improve the ergonomics? I imagine you must get a bit of a neck workout.
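For reference, the back-of-envelope math behind my guesses above (the 65" size for your panel and the 150% scale are assumptions on my part):

    # rough PPI / effective-workspace math, numbers are estimates
    import math

    def ppi(h_px, v_px, diag_in):
        return math.hypot(h_px, v_px) / diag_in

    print(round(ppi(7680, 4320, 65)))   # ~136 ppi for a 65" 8K panel
    print(round(ppi(3840, 2160, 43)))   # ~102 ppi for my 43" 4K
    print(7680 / 1.5, 4320 / 1.5)       # 5120.0 2880.0 -> "looks like 5K" at 150% scaling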
After getting this monitor, I'm pretty much sold on single screens again - but I had to switch my window management from keyboard-based tiling shortcuts to 'hold CTRL and move mouse' window management (BetterTouchTool on MacOS), with a tendency to stack up windows messily. I tried custom resize snap zones with BetterSnapTool - but I don't use them. I think that was the biggest challenge to switch from multi monitor to large format. It's a huge benefit to have everything in your context on one screen, but had to rethink how windows get moved around. Now I'm used to it, I want CTRL/SHIFT + mousemove modifiers on every system to deal with windows.
Also related, I bought a 4K tv last weekend for another system to use as a monitor, but found that the gaps between the pixels were unexpectedly large, creating a strange optical effect at close distance, making it unusable (but so close). There might be something different about the screen outer layer (on most TVs?) that polarizes light in a way better suited for distance viewing, but clearly not all TVs have this issue.
The power is not so bad, especially compared to the graphics cards you would want to use (and I use my GPU as a toe warmer). Samsung 8K sets specifically come with low-power presets which are probably usable in this scenario. Of course, with so many more pixels in 8K than in 4K there is a need for more power, but the EU regulation allows selling them if they also support an eco mode.
I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.
> I am old enough to recall 100W as the typical single light bulb
I'm regularly in a museum where they showcase some of the 1800s/1900s wares of households in my area. One is a kerosene/gasoline powered clothes iron. Just because something was once common doesn't mean it was good.
> I still use an electric tea kettle that touches the multi kW range daily
How many hours a day is your tea kettle actually using multi-kW? The more useful comparison is how many kWh/day are these appliances using.
Fair enough. Though I didn't live during kerosene times. My tea kettle uses 0.06kWh per session, so it would take one to two weeks of tea for me to reach the energy of a day on such a monitor (see my other comment for best guesses on the energy use of this monitor). On the other hand, a typical pasta recipe on an electric stove would be 2kWh, so several days of use of such a monitor.
I realize my previous comment might have come across as more adversarial than intended. Sorry if you took it that way.
And yeah as your comment shows it's really kind of an odd comparison to make in the end. Ultimately I'm of the mind that if the 8K screen really gives you a lot of value then it's probably worth it. You're dealing with that energy cost, and ultimately it's up to society to properly price externalities into the energy costs. You can make the decision whether the energy costs are really offset by the extra value you get.
But like, an 8K screen does use a considerable amount more energy than say a 4K. For a bit back in the day people really started to care about energy use of CRTs as they kept getting bigger and fancier. Then LCDs came out and slashed that energy usage massively compared to an equivalent size. Practically negligible compared to what a decent workstation would use. Now we're finally back to the screen itself using a pretty big fraction of energy use, and IMO consumers should pay attention to it. It's not nothing, it's probably not the single biggest energy use in their home, but it might be around the same as other subscriptions they're questioning in a given month.
And yeah, in the end I think that energy metric should be based on how many kWh you end up using on it in a month or whatever your billing cycle is. Compare it to the value you get for other things. Is it worth a week of tea to run it for a day, cost-wise?
I had a period of time where I bought a car for $3k. I then valued every big ticket thing to the utility I got from a whole car. "That's like .75 Accords, is that worth it?" Kind of an odd way of looking at things but really put a lot of value into perspective.
The eco mode is not usable, it's the manufacturer's way around a ban of 8k monitors. These monitors use at least twice what other monitors of the same size use, sometimes it's four times as much. And these measurements are probably in eco mode, so it could be worse.
> I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.
Not sure why you mention this here? Just because we had horribly inefficient light bulbs our monitors can use twice as much?
I’m guilty of this as well. Folks of a certain age will always tend to measure energy consumption in “light bulbs.”
Sort of like how Americans always measure size in “football (gridiron) fields.”
The energy consumption of a traditional incandescent bulb, while obviously inexact, is nonetheless a somewhat useful rough relative measurement. It is a power draw that is insignificant enough that we don't mind running a few of them simultaneously when needed, yet significant enough that we recognize they ought to be turned off when not needed.
I always turn my monitors to the lowest possible brightness for long work sessions, so I assumed (perhaps mistakenly) that this eco mode would already be close to my settings out of the box, and if anything, too bright. Assuming 20c per kWh (California rates, mostly solar during the day) and one kWh per day (8h at 130 W average use), much higher than the allowed EU limit and the eco mode, the monetary cost comes down to about 4 USD per month. So definitely not negligible, but also not a reason to avoid being able to tile 64 terminals if one wanted to do that.
[edit: the above estimate is almost certainly an upper bound on the energy I would ever use myself with such an item; I would be curious to measure it properly if/when I upgrade to one, and curious if the OP has a measure of their own usage. My best guess is that in practice I would average between 2 and 3 kWh every week (2 USD/month) rather than 5 kWh, because I tend to work in low light.]
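For anyone who wants to check the arithmetic, here are the rough numbers I used (all of these inputs are assumptions, not measurements):

    # monthly cost sketch: 130 W average draw, 8 h/day, 5 workdays/week, $0.20/kWh
    watts, hours_per_day, days_per_week, usd_per_kwh = 130, 8, 5, 0.20

    kwh_per_week = watts * hours_per_day * days_per_week / 1000   # ~5.2 kWh
    kwh_per_month = kwh_per_week * 52 / 12                        # ~22.5 kWh
    print(round(kwh_per_month * usd_per_kwh, 2))                  # ~4.51 USD/month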
Yeah it does emit a bit of heat. I think around one or two hundred watts? I haven't measured it directly. I have a mini split air conditioner in my home office.
The comment above has very wrong numbers, by the way; typical consumption for the whole device should be around or less than what that poster claims is drawn just by the CPU!
What zoom (if any) do you typically run at? For instance, a 200% zoom would give you an effective resolution of 4K, but with much sharper and smoother text and rendered graphics.
I tried this a couple of years ago and had to ditch the TV because of too much input lag.
You mention input lag only once where you say:
> Although this post is mostly focused on productivity, most if not all 8K TVs can be run in 4K at 120 Hz. Modern TVs have decent input lag in the ballpark of 10 ms and may support FreeSync.
Have you measured this, or where did you get this number from?
The TV I bought was also advertised as low-latency, but I found it too high in practice (when typing in code, or scrolling, and waiting for the screen to update).
I don't know many Linux users doing 4k+ at 144hz. I am wondering if you do any screen capture or desktop recording, and if so what software you use and what your experience is like? I cannot reliably capture 4k/144hz with my setup but my desktop environment is still on X11. I tried KDE/Wayland and had a better experience, but run into other bugs based on their integration.
Just curious how your experience with sway has been. I installed it but wasn't expecting it to come with no config at all, and didn't really want to be bothered setting it up just to test screen recording.
The issue with X11 is that even if you record (using any software), it causes the display refresh rate to artificially drop, and it's a very bad experience overall when you run at 4K 144Hz. Ultimately the future is Wayland, but I am a little surprised how slow it has been for everyone to integrate it into their software.
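For what it's worth, if I do end up on sway, the tool I'd expect to reach for is wf-recorder (it targets wlroots-based compositors). A minimal invocation would look something like this, assuming an output named DP-1; the flags are from memory, so double-check against wf-recorder --help:

    # record one output to a file; hardware encoding via VAAPI is supposedly
    # possible with something like: -c h264_vaapi -d /dev/dri/renderD128
    wf-recorder -o DP-1 -f capture.mp4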
Yes. It makes the experience much better when anything is moving. It's hard to convey in words; try it and then go back to 60 to see what you're missing.
Similar to hard drive vs SSD. Before I used a machine with a SSD for the first time hard drives were fine, then my normal was conditioned to that of SSD speeds. Going back to hard drive speeds is painful, just like 60hz even for things like moving windows around the desktop.
Seems to be a rather subjective thing. Going from 4k to 1080p literally causes me headaches, going from 240Hz to 60Hz feels normal after a minute or two. Yes, it feels nicer, but that's it for the most part. Not something that makes me want to update the screen right now.
> Not something that makes me want to update the screen right now
That about sums it up.
I alternate between 120hz and 60hz monitors depending on where I’m working.
For software engineers: 120hz is “nicer”, and if you are buying a monitor today I’d say it’s well worth it to pay another $100 or whatever to go for 120hz+ versus 60hz. Certainly not worth upgrading for this alone however.
For designers: Same as above, but it perhaps leans a little closer to being worth the upgrade. The mouse cursor is noticeably much smoother, and if you're doing digital painting or something all day, 120hz+ might really be worth the upgrade all by itself if budget allows. Working with the 120hz (or is it 240hz now?) stylus on iPad Pros is revelatory for that kind of work.
For gaming: For any fast action gaming (for games and platforms that support high frame rates) it really is worth the immediate upgrade. Your world now looks and feels fluid. It feels real. Input lag is usually halved as well.
They are useful for the same reason response rate is important -- motion blur and judder. Things look more crisp and move more fluidly across the screen.
It's slow because there is no singular Wayland, just 12 different Waylands that diverge because the primary standard is underspecified, and it took 16 years for people to agree on functionality everyone agreed was needed in 1999.
What you really need to match is the angular resolution in microradians from your eye. You can make any screen smaller by sitting farther back. That said, I do wish my TV was only 42". I guess if you really want the ppi to be exactly the same as a 27" 5K screen, then 27 * 7680 / 5120 = 40.5".
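If you want to put numbers on "sit farther back", here's a rough pixels-per-degree sketch; the viewing distances below are just example values, and it assumes flat 16:9 panels viewed straight on:

    # pixels per degree of visual angle for a 16:9 display viewed head-on
    import math

    def ppd(h_px, diag_in, distance_in):
        width_in = diag_in * 16 / math.hypot(16, 9)              # panel width from diagonal
        px_per_inch = h_px / width_in
        inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
        return px_per_inch * inches_per_degree

    print(round(ppd(5120, 27, 24)))   # ~91 ppd: 27" 5K viewed from 24"
    print(round(ppd(7680, 65, 40)))   # ~95 ppd: 65" 8K viewed from 40", roughly the same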
This is exactly the reason I intend to stick with 4k for now: I don't want a display that large. I currently have a 48" 4k display, and I'd prefer to have a 42" or 36" one. (Good choices are hard to find, though, particularly if you actually want 4k rather than ultrawide, want OLED, and don't want to just use a TV.)
I bought the Philips Evnia which fits perfectly into that category at 42". Despite being a gaming monitor it's not garish and I've grown to love the ambilight.
Do you use macOS with this, and if yes, do you often share your screen? I find large monitors unusable for screen sharing on macOS in general, as it will share a lot of blank space along with the window you want to share, making the window minuscule for anyone who does not have an 8K monitor like you.
Interesting. Time to buy a new TV or monitor for programming. Wonder which resolution and size to go for. I use a 27" 4K for programming and a super-wide for my FS2020.
Btw, I would use two different pairs of glasses: one when I use it as a TV or for playing FS2020/2024, and another when I sit close to use it as a programming station.
Hey, I asked you on the other thread as well (the iMac one), but this was my question:
Hey, I have a similar setup (https://kayg.org/uses) where I use an LG C1 48" as my primary TV and monitor. I do all my work on it; however, I am unable to use tiling window managers as you recommend, because I always struggle to see windows / text placed above my eye level.
For that reason, I prefer to use manual window management solutions instead.
I am curious how you deal with that problem, one big-TV user to another? Or do you not have that problem at all?
thanks!
I did this same thing with a 50" 4K TV... I get it, and it does work... My biggest issue is that the TV brightness levels, even at low, were waaaaaaay too bright... I was using LCD... Is OLED better for this????
> There is also a Dell UP3218K, but it costs the same as an 8K TV and is much smaller and has many problems. So I do not recommend it unless you really don’t have the desk space. Sitting further back from a bigger screen provides the same field of view as sitting close to a smaller display, and may have less eye strain.
I've recently swapped out my dumb TV with a smart TV. The choice to go smart after clinging on to my dumb TV + old-school Chromecast was only motivated by advances in display tech. In retrospect the smart TV is a considerably worse experience UX-wise than the dumb TV + Chromecast. The built-in Chromecast in the new TV requires the TV to be logged into accounts for all the "apps" that the TV has. I can no longer just cast something from any device connected to my network and have it "just work" like it did before.
I know in this case you're working with HDMI and hopefully have managed to set the TV up to just display an HDMI output on bootup, but did you run into any of these infuriating "smart" TV things?
I was previously working at a lidar company and now I am working at a robotics company providing calibration and localization software to customers using a combination of lidars, cameras, and other sensors.
You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!) Think 8 cores of the fastest arm 64-bit processors available plus extra hardware accelerators! They need this extra processing power to handle the 8K television load, such as upscaling and color transforms - which never happen when you are using them as a monitor!
So, 8K TVs are a big energy-suck! There's a reason why European regulations banned 100% of 8K TVs until the manufacturers undoubtedly paid for a loophole, and now 8K TVs in Europe are shipped in a super-power-saver mode where they consume just barely below the maximum standard amount of power (90w) ... but nobody leaves them in this mode because they look horrible and dim!
If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
Anecdotally my house draws 0.4 kW when idle and 0.6-0.7 kW when both my 8K screen and my computer are on. Since my computer draws 0.1-0.2 kW, I surmise that the QN800A doesn't draw 300-400 W total --- maybe 100-200 W.
I run my screen on a brightness setting of 21 (out of 50) which is still quite legible during the day next to a window.
Also, I have solar panels for my house (which is why I'm able to see the total power usage of my house).
The parent comment is completely wrong on nearly every point it makes. I don't know why it's so upvoted right now.
It doesn't even pass the common sense test. Does anyone really think TVs have 200W CPUs inside just to move pixels around? That's into the territory of a high-end GPU or server CPU. You don't need that much power to move some pixels to a display.
I didn't smell anything. A 200W PSU isn't terribly expensive and being cheaper than more efficient processors seems reasonable. I also only run a single 4k monitor so haven't thought about driving 4x the pixels recently.
That's a facially absurd statement. Just on the numbers:
The US consumes 500 gigawatts on average, or 5000 watts per household.
So if every household bought an 8K TV, turned it on literally 100% of the time, and didn't reduce their use of their old TV, it would represent a 10% increase in power consumption.
The carbon emissions from residential power generation have approximately halved in the past 20 years. So even with the wildest assumptions, it doesn't "throw away all the progress we've made on Global Warming for the past 20 years ...".
To put it in perspective, an electric car might need 350 Watt-hours per mile. A 10-mile drive would use 3.5 kWh. That's equivalent to about 24 hours of using that monitor at normal settings, or about 8 hours at maximum brightness.
The comparison doesn't make sense, though, because if you drove to the office you'd still be using a monitor somewhere. A single 4K monitor might take around 30-40W. Using four of them to equal this 8K display would come in right around the 139W typical power consumption of the 8K 65" monitor.
There's no "fixed budget" of energy that is ethically ok to use. The parents point was that these devices are woefully inefficent no matter which way you look at them.
The "best" thing to do would be neither, and is usually to just use the device you have - particularly for low power electronics as the impact of buying a new one is more than the impact of actually running the thing unless you run it 24/7/365
> There's no "fixed budget" of energy that is ethically ok to use.
Not even 0.00001 W? How is it ethical to live in the first place in such case?
> The parent's point was that these devices are woefully inefficient no matter which way you look at them.
It's always a trade off, of productivity, enjoyment vs energy efficiency, isn't it?
If I find a setup that allows me to be more productive and enjoy my work more, certainly I would need to balance it with how much potential waste there is in terms of efficiency.
> The "best" thing to do would be neither, and is usually to just use the device you have
That's quite a generic statement. If my device is a budget android phone, do you expect me to keep coding on it, not buying better tools?
> You COMPLETELY missed the elephant in the room : 8K TVs have really, really massive CPUs that waste a TON of power (150-200w for the CPU, 300-400w for the TV, often!)
RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Your numbers aren't even close to accurate. 8K TVs do not have 200W CPUs inside. The entire Samsung QN800A uses less power during normal operation than you're claiming the CPU does. You do not need as much power as a mid-range GPU to move pixels from HDMI to a display.
> There's a reason why European regulations banned 100% of 8K TVs
This is also incorrect. European regulations required the default settings, out of the box, to hit a certain energy target.
So large TVs in Europe (8K or otherwise) need to come with their brightness turned down by default. You open the box, set it up, and then turn the brightness to the setting you want.
> until the manufacturers undoubtedly paid for a loophole
This is unfounded conspiracy theory that is also incorrect. Nobody paid for a loophole. The original law was written for out-of-the-box settings. Manufacturers complied with the law. No bribes or conspiracies.
> If everybody were to upgrade to an 8K TV tomorrow, then I think it would throw away all the progress we've made on Global Warming for the past 20 years ...
The Samsung QN800A 8K TV the author uses, even on high settings, uses incrementally more power than other big screen TVs. The difference is about equal to an old incandescent lightbulb or two. Even if everyone on Earth swapped their TV for a 65" 8K TV tomorrow (lol) it would not set back 20 years of global warming.
This comment is so full of incorrect information and exaggerations that I can't believe it's one of the more upvoted comments here.
> RTINGS measured the Samsung QN800A as consuming 139W typical, with a peak of 429W.
Can you explain why a TV's power draw fluctuates so much? What does peak load look like for a TV? Does watching the NFL draw more power than playing Factorio?
Power consumption varies significantly based on what's being displayed, on top of brightness settings.
I have a 42" 4k LG OLED. With a pure black background and just a taskbar visible (5% of screen), the TV draws ~40W because OLED pixels use no power when displaying black.
Opening Chrome to Google's homepage in light mode pulls ~150W since each pixel's RGB components need power to produce white across most of the screen.
Video content causes continuous power fluctuation as each frame is rendered. Dark frames use less power (more pixels off/dim), bright frames use more (more pixels on/bright).
Modern OLEDs use Pulse Width Modulation (PWM) for brightness control - pixels switch rapidly between fully on and off states. Lower brightness means pixels spend more time in their off state during each cycle.
The QN800A's local dimming helps reduce power in dark scenes by dimming zones of the LED backlight array, though power consumption still varies significantly with content brightness. It's similar to OLED but the backlight zones are not specific to each pixel.
Dark mode UIs and lower brightness settings will reduce power draw on both QLED and OLED displays.
Traditional LCDs without local dimming work quite differently - their constant backlight means only brightness settings affect power, not the content being displayed.
This explains those power fluctuations in the QN800A measurements. Peak power (429W) likely occurs during bright, high-contrast scenes - think NFL games during a sunny day game, or HDR content with bright highlights. For gaming, power draw is largely influenced by the content being displayed - so a game like Factorio, with its darker UI and industrial scenes, would typically draw less power than games with bright, sunny environments.
I was under the incorrect impression that the power consumption would be related to the rendering of the image (a la CPU/GPU work). Having it related to brightness makes much more sense.
To be fair it's not the energy that you're concerned with; it's the source of that energy.
Private jets can't run off nuclear power grids. Also the real problem-child of emissions is not America. China has a billion more people, what are their TVs like?
Good points. I would go further and say it is the integral of emissions over time that we would be most concerned with. From that perspective, over the last 200 years, there are long-standing problem children and rising problem children.
> The average American household uses about 29 kilowatts of power per day (29,000 megawatts).
Ignoring the megawatts error that the sibling pointed out, it's 29 kilowatt hours per day. Watts are a unit of power consumption -- joules (energy) per second.
One kilowatt hour is the energy used by running something at 1,000 Watts for one hour.
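As a quick sanity check on the units (taking the 29 kWh/day figure at face value):

    print(29_000 / 24)   # 29 kWh/day is an average continuous draw of ~1208 W, i.e. about 1.2 kW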
> The bezels and gaps in between the monitors introduce distractions and one is limited in how one may arrange terminals and windows across multiple displays.
To me, the segmentation is a feature. It lets me offload information density and focus. For example, I commonly have an editor on one screen, a browser on the second, and something like a chat app, terminal, etc on the laptop screen.
Nobody's stopping you from segmenting one big monitor into different regions; and you get to choose how big those regions are from day to day rather than being forced into it.
They tend to be relatively poorly handled by the software, at least out of the box.
Every major modern OS now has some level of tiling/splitting on a monitor's edges baked into its window manager by default. Some can be tweaked to split into smaller subgroups, but that often requires less well-tested/polished options (some apps just ignore the hints), or even third-party extensions.
That's too much extra work. With multiple monitors you can maximize primary apps while still having manual management of smaller supporting apps on another monitor. You also get more edges for rapid snap to the sides of a monitor.
Since you seem to know about the best window managers, can you recommend one for MacOS which will let me direct focus to whichever window is left/right/down/up of the currently selected one? i3/sway does this just fine, but my impression is that MacOS's api doesn't allow third party developers to pull it off, but I'd love to be wrong about that.
Not the person you were asking, but after years of using i3, AeroSpace is the only way I can use a Mac productively, and does indeed have the feature you're describing.
Wow. Thanks for this. I've bounced off a number of tiling WMs on MacOS over the years - Amethyst, Yabai, others I can't remember - but Aerospace is really excellent. Can't believe I've never heard of it before. Love the custom implementation of spaces as a solution to what ails a number of other tiling WMs. I installed it this morning and disabled the mish-mash of Rectangle Pro, Better Touch Tool and OS kb shortcuts I'd been using.
It has quirks and limitations, some of which can be fixed by disabling System Integrity Protection, but it can definitely handle window tiling and navigating with keybindings when you use the companion daemon https://github.com/koekeishiya/skhd
I use yabai, which does what you say and more pretty well. It also lets you completely remove the Spaces transition effect, but this requires disabling SIP.
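For what it's worth, a minimal skhdrc sketch for the directional focus the parent asked about (the alt bindings are just my preference; the yabai commands themselves are the stock --focus directions):

    # ~/.config/skhd/skhdrc -- vim-style directional window focus via yabai
    alt - h : yabai -m window --focus west
    alt - j : yabai -m window --focus south
    alt - k : yabai -m window --focus north
    alt - l : yabai -m window --focus east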
Although if that big monitor is an OLED, segmenting it into halves or quarters is kind of begging to end up with a line burned in down or across the middle eventually.
Samsung solves this in the TV itself. It can be annoying when the edges of the screen are ever so slightly off, but I'm glad I don't have to worry about it. QCQ90S. I wouldn't recommend it since the TV's GUI is glacially slow, but then again all the ones I tried last year were.
I have done this in the past using a tiling window manager, and it's still better to use different displays. There is something about our monkey brains that makes 'different physical object = do different things' work better than having it all on the same monitor.
I did get it to work for me with thick black bars between the screens, but when you're giving up an inch of screen real estate for every virtual monitor then you might as well get physical ones.
I use a single ultrawide at home and dual-monitors at work.
Initially I thought the one-monitor experience was more seamless, but I do miss the implicit window organization that dual monitors provide. And screen sharing on the ultrawide is a pain.
If your ultrawide is anything like mine, it also has a setting that lets it register as two separate monitors (PIP/PBP mode), which is like having two monitors without the bezel, but with the convenience of "there's an edge" in the middle of your screen when doing regular desktop work.
Does require two cables of course, but if you're driving an ultrawide, you're probably using a graphics card with three or four outputs anyway.
My Samsung ultrawide has a side-by-side mode with two input cables. Screen sharing (and Windows) thinks it's two monitors, but I can stretch windows all the way across both if I want to, since it is an extended set.
Best of both worlds, I wish there were a way to configure this within the OS so that you could make a single screen appear like 2, 3, or 4 logical screens.
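(On X11 there does seem to be a way via xrandr's virtual monitors, though I haven't tried it myself; the output name and the physical millimetre sizes below are placeholders:)

    # split one 3840x1080 output (DP-1) into two 1920x1080 logical monitors
    xrandr --setmonitor left  1920/350x1080/400+0+0     DP-1
    xrandr --setmonitor right 1920/350x1080/400+1920+0  none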
A decent window management tool (e.g. Rectangle.app) should resolve most of your window management issues - set up many drag points to easily divide windows by half, thirds, quarters, sixths, etc.
Most screen share apps should support sharing by window. Also best for privacy (so your viewers don't see the side channel chat notifications pop up).
Also an ultrawide monitor is preferable for spreadsheet warriors.
I will not give up my 49" 32:9 for anything lesser.
Same, I've never liked spanning a window across multiple monitors. The discontinuity of the bezel is a handy mental break. Often I'll have email and teams on one screen and my main item of work on the central screen.
Same. This utility is also multiplied by having a separate set of virtual desktops on each display, which lets one create sets of windows/apps that can be mix-matched between screens, reducing the amount of window-shuffling to almost nothing after initial setup.
This is only possible under macOS and Linux, unfortunately. On Windows virtual desktops are still kind of a weird hack that spans one desktop across all monitors.
Yeah that issue seems weird to me, because I've never found bezels themselves to be that much of a problem. Like sure, less bezel is better. But I have some pretty wide gaps in my work monitors, and I've never found it to be a problem.
This article, and a lot of "productivity" articles, feel like spending a lot of time and effort for marginal-at-best improvements. I don't know their specific workflow, but I'm pretty sure they could get basically the same amount of productivity with a handful of 1080p monitors.
It's not a problem until you want to watch a movie or play a game; then you have a black bar down the middle. Compare that with the opposite, just having one screen to split however you'd like virtually.
After 15 years of having a desk job I find that I’m more sensitive to the position I sit in. My back feels a lot better if I have a single, regular sized screen right in front of me, instead of having additional screen estate on the sides or below (as with a laptop).
At the same time I use virtual desktops that I can switch with both keyboard and mouse.
The general advice is to have top of monitor at eye level, but it's been wrong advice for me personally. I now put the middle of the monitor at eye level. Keeps my head up and posture better. Leaning back instead of stooping.
The general advice provided to me, and relayed by me, is eyes centered at about two-thirds of the screen's height.
The best advice received and relayed by me regarding posture might surprise you.
If you struggle with posture, stop caring about what other people might think about your posture. Changing/Tweaking posture all the time might look bad, but it also tends to mitigate the effects of being frozen in bad posture(!) The health impact is too significant to ignore.
Yeah I think the only ergonomic advice I believe anymore is that there does not exist a position that is ergonomic to sustain for more than a couple hours. Humans are not evolved to stay stationary, few mammals are really.
I do this too, though mostly out of necessity. I use a 27" screen a couple feet away. To get the top of the monitor level with my eyes I'd either have to lower it so the bottom of the monitor was almost flush with the desk (which my current monitor's stand won't do anyway), or get a taller chair/lower my desk, both of which would leave my legs rubbing up against the desk underside and my arms at an uncomfortable angle for typing.
Either I have an abnormally short torso, or that advice was written back when most people were using a 14" display.
Indeed. AIUI your head needs to be back, chin tucked in, which means looking down a bit. If you're looking level or up you're going to be sticking your head out a bit
I'm the same. I use a single 27" 4k monitor and use virtual desktops. The best upgrade for me though was getting a computer prescription for some glasses that I keep on my desk.
Sometimes I think about upgrading to a 5k monitor. The Apple Studio Display looks great, but I'm a Windows user and I'm guessing a lot of the nice features of that display are Mac-only.
There aren't a whole lot of options for 5k monitors. Other than Apple I think there's a Dell, but it's too wide. There's a Samsung but I've been burned by Samsung too many times. There's also an LG 5k monitor but it gets pretty weak reviews.
> The Apple Studio Display looks great, but I'm a Windows user and I'm guessing a lot of the nice features of that display are Mac-only
I can possibly be of some help here. I have a Studio Display, however my work-provided machine is a Dell laptop and so that is what is connected to it most of the time.
Providing your machine can output video via Thunderbolt or USB-C, it will work. That is fairly common these days, though Windows machines capable of driving a 5120x2880 signal can be harder to come across, particularly in the corporate laptop world, though I don't know how much of a concern that is to you.
My last work machine maxed out at 4K which the Studio Display would happily scale up to full screen. I would describe it as substantially sharper than e.g. a 2560x1440 display of equivalent size, but still noticeably less sharp than the full native 5K (obviously). My current machine can do the full 5K, but the performance leaves a lot to be desired (however the thing is a turd anyway, too much corporate security crap bogging it down).
Speakers, camera, and microphone built into the display all work totally fine from Windows. What may be a total non-starter is that you need a Mac or iPad to change the brightness, because there's no physical controls on the display itself and Windows doesn't expose a way to control it. I am lucky/unlucky in that my home office does not get a huge amount of natural light, meaning I've been able to set it to a comfortable brightness from my Mac and then just leave it.
Overall it's a very nice monitor if you can work around the brightness thing. A possibly better contender though is the recent-ish 5K variant of the Asus ProArt[0]. I was using the 1440p version of the same monitor before I got the Studio Display, and I was very happy with it. Good colour reproduction, USB-C Power Delivery for one-cable laptop docking, and a far more adjustable stand than the SD. Worth a look.
I've got the LG 5K and it's been totally dependably kick ass for the 4 years (i think) since I got it (from the Apple Store). Mostly using it on macOS but have used it with Windows and haven't tried with Linux.
There are several 5K monitors set to appear next year: BenQ, Alogic, maybe something else. There are also Chinese no-name 5K monitors which use panel rejects from the ASD.
Agreed. To each their own, but the obsession with the biggest and/or most possible screens is something that is very hard for me to relate to. As soon as I am regularly craning my neck to see all of my screen real estate, it is no longer a positive in my life. I'm glad these solutions exist for people who enjoy them, but they are definitely not for me.
Same here. I only use and want a single monitor setup. I can alt-tab between windows faster and more comfortably than turning my head to another screen.
Also a dual/multiple setup bothers me for losing the mouse boundaries when it crosses to another screen - I'd rather have the mouse bounded on one screen for faster access to menu bars at the edges.
Same, I switched back to a single 27" screen last year. For me it's better to focus on one thing at a time especially since my eyes aren't the best, and I switch between virtual desktops with F1-F4 (or when I use my mac with the 3-finger swipe gesture).
MacOS also has ctrl+left/right for switching virtual desktops. The gesture can get a bit tedious if you're jumping across multiple desktops in one go. I don't think it's particularly ergonomic either.
I upgraded recently, by buying a friends old Samsung Odyssey G9 49" curved monitor off him (he was emigrating). Before that I had 2 x 27" monitors, a setup I had used for ~10 years.
I honestly think the curve is essential when dealing with such a wide display. The alternative would be - as article states - to set it back a little and have a deeper desk so you can actually see the edge of the screen properly. I don't see the point in having a large screen with high pixel density if the edges are not actually easily visible to me without moving my head or body laterally.
The lack of bezels is great though - I'd definitely agree on that front, having 3 web browsers or editors open side by side suits me really well.
As weird as the aspect ratio can be on a curved ultrawide, I think it's also more natural and ergonomic to keep your head/eyes at a constant height and just move them side to side. With a monitor that has a lot of verticality you're gonna have to tilt your neck back more.
It’s different from person to person whether the curve is good or not!
I have a ruler flat 55” OLED TV as main monitor. It’s perfect for me. I’m like… 1-1.5 meters from it where I’m closest to it, haha. The edges are further away. It’s fine! – imo / ime.
(The need for the curve is also subtly different depending on how the panel was made. I tried a flat 43” IPS 4K monitor, expecting IPS to be good. And it wasn’t very good. The IPS features in that panel were large enough to affect viewing angle.)
> It’s different from person to person whether the curve is good or not!
The amount of curve also varies a lot between models so there's some nuance even within that. The curve might be as strong as 800R or as weak as 2300R depending on the monitor, where the number corresponds to the radius of the circle the panel follows in millimeters.
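As a rough illustration of what those numbers mean in practice, here is a small sketch (assuming the panel follows a circular arc, and taking a 49-inch 32:9 panel to be about 1200 mm wide; both are approximations):

    import math

    def edge_setback_mm(panel_width_mm: float, radius_mm: float) -> float:
        # The panel follows a circular arc of the given radius, so each half
        # spans an arc angle of (width/2)/radius; the edges sit closer to the
        # viewer than the center by radius * (1 - cos(half_angle)).
        half_angle = (panel_width_mm / 2) / radius_mm
        return radius_mm * (1 - math.cos(half_angle))

    # Approximate width of a 49" 32:9 panel: ~1200 mm.
    for r in (800, 1800, 2300):
        print(f"{r}R: edges sit ~{edge_setback_mm(1200, r):.0f} mm closer than the center")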
Same, though I'm also on 49" (5120x1440). They're selling them for extra cheap on Amazon with extended (36mo) warranties because they're prone to breaking, but I had the Samsung contractors out here this month and they did a great job fixing mine that randomly died one day -- for free! If you're a chill soul, I'd say it's worth the risk.
I sound like a shill, so Samsung plz hmu. $999 for a beautiful OLED monitor that fits a terminal, a browser, and 4 (font size 8...) 100col text editor windows is a gamechanger.
I use mine for productivity only (I don't game at all) and it seems the consensus is OLED's no good for things like Konsole/xterm (-style) windows and general text readability, though.
32" Odyssey G7 is the pick for me, I wouldn't mind an upgrade to the 4k version, but the 1440p version is more than good enough.
I also don't see the point in having a screen so big I have to move my head, or contrarily a screen so big that I have to push it back so the pixel density matters much less.
Low response time (i.e. time it takes for a pixel to change color) to reduce ghosting, and a high refresh rate up to 240 Hz.
These monitors are expensive and do not have very high resolution. If you're not a hardcore fast reflex gamer, and you spend a lot of time looking at text, then IMO it's better to buy a higher resolution monitor for less money.
I think at that point it’s not really conscious any more? It always takes me a little while to realize my monitor somehow went to 30hz, and that’s why I’m feeling something is off.
4K gaming monitors do provide a reasonable middle-ground between "extremely fast but only 100-110ppi" and "extremely high res but only 60hz" now though. You can get 163ppi at 144hz without breaking the bank, which isn't quite retina by Apples definition, but it's good enough for me considering the benefit of high refresh rate.
I'm guessing because it allows you to set the Field-of-Vision to be pretty wide?
I mostly play simulation games, particularly flying, and having a wider FoV makes things easier, until you're ready to go to the top step of using VR instead so you also get depth perception and essentially 360 FoV since you can rotate your head.
I wonder what the math would look like to properly render 3D scenes onto a curved display. Could it be accelerated as well as the regular matrix operations used for perspective projection onto planar screens?
During the pandemic I did try out my 4K TV as a game monitor. I had a combination of furniture so that I could sit rather close with my eyes approximately half way up the screen, with a keyboard and mouse in a reasonable position. Then, using an older FPS game I got it to where my laptop GPU could hit good frame rates and I adjusted the game's viewing angle to match how the screen fit my field of view.
It was deeply immersive in spite of me being so close I could "see the pixels". The only time I've felt more immersed was demoing Quake in a 3 wall + floor CAVE at a national lab decades ago.
> I wonder what the math would look like to properly render 3D scenes onto a curved display. Could it be accelerated as well as the regular matrix operations used for perspective projection onto planar screens?
The math is pretty simple to account for a curved viewport, even though I don't think any apps actually care about that. Most displays aren't curved enough to make it a meaningful difference.
We don't have fixed function pipelines anymore either so that could definitely be handled by hardware.
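To make that concrete: a planar projection spaces view rays by the tangent across a flat image plane, while a cylindrical screen centered on the viewer spaces them evenly in angle. The sketch below is purely illustrative and just compares how far the two mappings drift apart toward the edges:

    import math

    def ray_angle_deg(u: float, half_fov_deg: float, planar: bool) -> float:
        # u is the horizontal screen coordinate in [-1, 1] (0 = center).
        # Planar projection: columns are evenly spaced on a flat image plane,
        # so the ray angle follows atan(u * tan(half_fov)).
        # Cylindrical screen centered on the viewer: columns are evenly
        # spaced in angle, so the ray angle is simply u * half_fov.
        half_fov = math.radians(half_fov_deg)
        if planar:
            return math.degrees(math.atan(u * math.tan(half_fov)))
        return math.degrees(u * half_fov)

    # Compare the two mappings for a 100 degree horizontal field of view.
    for u in (0.0, 0.5, 0.9):
        p = ray_angle_deg(u, 50, planar=True)
        c = ray_angle_deg(u, 50, planar=False)
        print(f"u={u:>3}: planar {p:5.1f} deg vs cylindrical {c:5.1f} deg")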
This used to be much more true, but almost all PC games support 21:9 now and 32:9 support is pretty common too. Saying "most games" screw it up is an exaggeration IMO. Even on games that don't officially scale, on PC they almost always have a customizable FoV that gets the perspective correct again. Many modern games are even smart enough to rearrange the UI so that the critical info (health bars, ammo counts etc) is in the center of the display and not attached to the edges.
PC games have kinda been forced to support ultrawides whether they like it or not - the 21:9 class especially has exploded in popularity for gaming PCs.
I've gamed in 32:9 for years now - I wouldn't go back. The curve is not exaggerated enough to be a meaningful projection issue on most curved displays and games.
It's the curve that messes things up. It's just significantly more incorrect on wider displays. Many monitors are 1800R, and that's easily curved enough for the projection error to be quite pronounced at 32:9 using a planar projection.
The proper monitor height is when the top third of the screen is at or slightly below your eye level when seated or standing upright. This positioning helps prevent neck strain and allows for a comfortable viewing angle.
The top third of a large TV will be much higher than that, which will cause long term discomfort.
That's why large monitors have much wider aspect than TVs.
Yep a huge monitor sounds good in theory but you end up with neck and eye strain from panning your head constantly unless you place it so far away that it’s effectively a regular monitor at a regular distance.
Would recommend a black background Vscode theme for an OLED. The black background with red accents looks beautiful, at least on my smaller XPS 15 4k OLED. I use Dobri Next Black with some customizations but it looks good by default as well.
Correct me if I'm wrong - in OLED monitors, a black pixel actually means a powered-off pixel. So it's a good idea to use as much black as possible in static areas to prolong the monitor's life.
Maybe if you put what you're working on at the top of the screen and only looked at that one thing it would be a problem, but realistically you use a bigger screen the same way you'd use a bigger desk: you don't put what you're working on out of reach, you just have more room for the tools for what you're working on.
Have been sporting a 4K LG CX48 OLED since ~Sept 2020; best monitor decision ever. I've got two HDMI out cables, 1 going to my gaming rig and the other to my Macbook where I do my work as a developer.
I haven't noticed any burn-in or dead pixels. You need to set it up for success: enable all the burn-in prevention settings the monitor provides (static image darkening, pixel shifting & cleaning). It's also a great idea to do other things such as sleeping the monitor after 1 min of inactivity, no screensaver (or just black), black desktop background, hide taskbars, etc.
edit: to add, i have the monitor mounted to the wall and about 1" above the height of my desk[1] - this puts the center of the screen directly at eye level
Have you managed to get good text rendering? I still can't find how to get good sub-pixel anti-aliasing working for the text sizes I want to use on my LG 42" OLED.
It's otherwise awesome though, for the type of gaming I do (non-competitive, so stuff like D4 and Cyberpunk) it's completely unmatched.
I just wish it was a little bit better for smashing out code without annoying text fringing that distracts me.
I stole one of these from Best Buy for $500 in march. It’s just so good. I haven’t turned off the local dimming thing with the service remote so that’s still a thing but damn is it such a great monitor. And for gaming cyberpunk at 120hz with hdr melts your face.
According to https://tools.rodrigopolo.com/display_calc/, a 65" 8K like the one in the article is retina at a 26" viewing distance (136 PPI). For reference, a 27" 4K screen has 163 PPI, and is retina at 21" by the same math. A 27" 5K (like the Apple Studio Display) has 218 PPI and is retina at 16".
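The arithmetic behind those figures is easy to reproduce. The sketch below treats "retina" as roughly one arcminute per pixel at the viewing distance, which is the usual loose definition rather than anything official:

    import math

    def ppi(diagonal_in: float, w_px: int, h_px: int) -> float:
        # Pixels per inch from the diagonal size and the pixel resolution.
        return math.hypot(w_px, h_px) / diagonal_in

    def retina_distance_in(pixels_per_inch: float) -> float:
        # Distance at which one pixel subtends about one arcminute, the usual
        # loose threshold for 20/20 vision no longer resolving the pixel grid.
        pixel_pitch_in = 1 / pixels_per_inch
        return pixel_pitch_in / math.tan(math.radians(1 / 60))

    for name, diag, w, h in [("65in 8K", 65, 7680, 4320),
                             ("27in 4K", 27, 3840, 2160),
                             ("27in 5K", 27, 5120, 2880)]:
        p = ppi(diag, w, h)
        print(f"{name}: {p:.0f} PPI, 'retina' beyond ~{retina_distance_in(p):.0f} in")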
The DPI of this screen is too low for all the drawbacks. Would rather have crisper text (150+ DPI, 200 preferable) and/or be able to carry it myself. Needs to be about 42" for that.
One of my larger concerns about using TVs as monitors is security - it's important to remember that these devices are not just displays.
ACR / Automated Content recognition means that they are also capable network connected framegrabbers and analysers.
If you connect it to the network to also use it as a TV, and you don't tick all the right "I do not give you permission" boxes (and the firmware actually respects those choices), then your TV will be uploading signatures of everything you watch (including HDMI inputs) to the TV manufacturer for them to sell on to advertisers, political parties, etc.
While I believe the normal mode of operation is just to upload some kind of image signature / hash, I doubt there is anything that technically or legally stops screenshots being uploaded for "additional analysis".
Note that this is incredibly lucrative for the manufacturers, so their incentives are minimally aligned with protecting our privacy or security here.
Also with the advances in local AI processing it seems realistic that the image analysis gets much more powerful and suddenly you have something like Microsoft's Recall but in your monitor/TV (and used _solely_ for profiling and not your benefit).
“It can display seven equally spaced vertical columns of text (critical importance), has driver issues (minimal importance), wake issues (who cares), it costs as much as four smaller monitors (this is good), I need a huge desk (hell yeah), there are multiple image quality issues (well it’s not like I have to look at it all day)…”
It is like “I spent fifteen hundred dollars on a multitude of hassles due to purchasing the wrong type of display, but due to the lack of bezel this is a prime efficiency move”
Only the first one (dirty screen) is a real issue, but it is subtle and irrelevant to programming; the second one (checkerboard), as the post explains, is solved by toggling an option in settings.
> Driver issues
The post explains that it works perfectly with current NVidia drivers on Linux, and on Windows both AMD and NVidia have had driver support for HDMI 2.1 for years.
I chuckled at "The 8K display is only $1500 at BestBuy!" - the "only", lol. I spent $400 on my projector that I use for my main screen and it works great. But when I did that I had previously only bought $200 projectors. So even that was not an "only" for me.
I've never spent more than $75 on a monitor. I only buy used. Monitors depreciate like crazy and businesses are constantly getting rid of them, even when they're only a few years old. Yeah, you aren't going to get some 9001Hz 10K giga-OLED whatever, but I'm a programmer. If it displays text with reasonable contrast without hogging my whole desk, it does everything I need it to do.
The most expensive one - the $75 one - is a 24" 1920x1200 IPS display with HDMI, DP, VGA, 2x DVI, S-Video, and YPbPr component. Never seen those last two on a monitor before, but there they are. I don't use that display as my main one anymore, but I keep it around because it's awesome and it plugs into literally anything.
Remember dropping a grand on a 30-inch 2560x1600 back in the day and thinking that was the ultimate.
The 40 to 45 inch range is the ideal; otherwise screen real estate goes too far into the peripheral vision.
The other issue with a lot of really big screen real estate is managing lots of windows. With dual screens you can usually maximize applications more easily than with one, because when you maximize on the super big screen it just takes up everything.
And the most relevant stuff, which usually lives in the upper left hand corner, gets pushed to the far upper left corner, which is actually pretty far out of your main field of vision.
But I still love the 43-in 4K TV I've been using since 2010 or so
No, I wish! It's 1080 the picture isn't amazing but it works fine and it's 100" on my wall across the room from my couch, so I'm happy. I've toyed with the idea of 4k projectors but they're usually magnitudes more expensive than 1080 projectors!
Sorry, this recommendation will probably disappoint, it's from Walmart. It's a Vankyo Performance V700W. I can't necessarily recommend it.
I have a problem with over-shopping for things, such as spending too much of my life researching and frustrated before either never buying, buying above budget, or just impulse buying making the research time wasted. So, if I can instead work with something I can drive to Walmart and spend $300-400 on, I am happy.
It's been fine but it's nothing special. Does the job and the picture is pretty clear when focused properly. It's bright enough and has good color and picture quality for my purposes. It's 100" on my wall across the room from my couch so we use it a lot for gaming and watching videos. For programming stuff it works but can't optimize for space in the IDE by bumping font size down like I would with high DPI monitors.
I'm also on my second one as the first was left on constantly and started to develop dark spots. They were kind of fun to watch as they'd start really bright and then fade, but obviously only in hindsight because it made the screen hard to see. The last time I bought it, the price had dropped; I think it was <$200. I have had it for about a year and turn it off when I'm not using it and it's holding up a lot better!
It is and not very high quality. Sorry I didn't mean to recommend everyone get a projector here or pump projectors. I just enjoy my setup and it was relatively cheap!
It will do screen mirroring though. It has 2 inputs and I use those directly and it doesn't offer apps or anything from what I've seen.
I'd give a lot to go back to my 20 year old eyes that could see pixels without special glasses. Sure I can't see pixels (well maybe I still could on a janky third party CGA monitor from 1983), but it isn't worth it. (I'd say save your eyesight, but realistically I'm not aware of anything you can do to keep it past about 45)
I think you'd have to sit further back than is otherwise natural (and then have the issue of legibility/lost workspace) to achieve "can't see the pixels" on this.
Sure it's 8K but it's 65", it's only got a PPI of 135. For comparison Apple (computer) displays and a handful of third parties that target Mac use are generally 200-220 PPI. That is can't see the pixels density, even if you smash your face against it.
220 ppi output with no subpixel rendering (ie modern Macs) has clearly visible jagged edges in angled lines and letters if you've got good vision or correct your vision to better than 20/20 (my case: I get headaches if I don't).
If you are coming from typesetting world, laser printers from the early 1990s did 600dpi (dots per inch), and that remains sufficient for smooth lines, though newer printers will do 1200dpi too. Going down to 300dpi printouts is crap.
Heck, newer Kindles do 300ppi and that can clearly be improved.
Apple's "retina", like all things in life, does work for 90% of the human population as advertised, but there's still a big number of people who have better angular resolution than what they target.
There's always "better" of course, but my point was more that "can't see the pixels" doesn't usually mean "I can't see the pixels if I sit back from the display a bit". When the iPhone 4 was introduced, no one said "what's the difference, I can just hold my (other phone) away from my face further and I don't see the pixels!"
Can you reference an example that shows this phenomenon with angled lines? I haven't had an eye test specifically, but my vision is generally fine, and I don't see the effect you're referring to, on for example a lower-case "y".
I have a 55" 8K and I can't see the pixels while sitting 2ft away. Everything is crisp and I have a huge workspace. For mac I use 4k native so 2x integer scaling.
I've used both. I quite honestly don't care. I've heard many people that share your sentiment. But some of us just don't. Visible pixels are totally fine for me.
I went back from using different displays in HiDPI to using a single 43” 4K screen set to 100 % scaling. Screen estate trumps invisible pixels [for me, at the moment].
I didn't see any mention of how many times he has to pick up his mouse when it gets to the edge of the pad to get the mouse from one edge of the screen to the other.
Author here: I use a Logitech G Pro X Superlight but also I use the i3 window manager and rely on keyboard shortcuts for a lot of the navigation. I have the mouse sensitivity set so that the cursor can traverse the width of the screen when moving the mouse about 13 cm, without any acceleration. This is still precise enough that I can move the mouse pixel by pixel if needed.
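A quick back-of-the-envelope from the numbers in that comment (7680 horizontal pixels, 13 cm of mouse travel for a full sweep); this is a sketch of the arithmetic, not the author's exact configuration:

    # Assumed values from the comment above: 7680 px wide, 13 cm of travel.
    CM_PER_INCH = 2.54
    px_per_inch_of_travel = 7680 / (13 / CM_PER_INCH)
    print(f"~{px_per_inch_of_travel:.0f} screen pixels per inch of mouse travel")
    # With acceleration off, the mouse's CPI setting needs to be at least that
    # high, otherwise a single count would jump more than one pixel and
    # pixel-by-pixel positioning would become unreachable.
    print(f"minimum CPI for pixel-level precision: ~{px_per_inch_of_travel:.0f}")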
I find it annoying that they've kind of gotten rid of mouse trails and other easy ways of finding the mouse pointer.
That's one of the main drawbacks of a massive screen: if you lose the pointer it takes a lot longer to find it. It doesn't scale linearly with the width of the monitor; it scales with the square.
So a 50-inch monitor is going to take nearly three times as long to find a mouse pointer on as a 30-inch one.
I don't like those hotkeys where it highlights the pointer. I like the mouse trail; that's the one where I can most easily find it. But generally those went out of fashion about 15 years ago.
Pointer trails are still a feature in windows last I checked, and hitting ctrl to animate a circle around it works pretty much everywhere. I don't use either of these features nowadays, and usually find my cursor by moving it until I get to somewhere with high contrast.
I haven't seen anything I like quite as much for quickly finding the cursor as macos's "wiggle for giant cursor" feature.
The example he's chosen is of a ridiculously sized TV. 65" is living room TV size.
There are smaller OLED displays that would be more suitable (while still rather big). Many are 'just' 4k, but the smaller sizes should give one a decent pixel size.
LG CX series are literally #1 recommendation on almost any article about getting TV for PC gaming and maybe some work, for many years. I don't think OP is honestly looking around.
Have you had issues with image retention? I also like the 43” 4K setup for some things, but these days it seems IPS screens in that size are not as easy to find, I’ve always been wary of OLED due to burn-in
I got my 8k 55" tv for under 1000 usd several years ago. Brand new, from a brick and mortar electronics store. So it is definitely possible to make 8k monitors for less than 1000 usd.
A mere 55" with 8K resolution makes no sense as a TV, but it's glorious as a productivity monitor. But instead of becoming commonplace as monitors, the panels seem to just be disappearing even as TV's. At the moment I can't find anything at any price that can replace my current setup.
The market isn't working for monitors. Everything available now is either crap, or costs 10x more than it clearly could. Millions of people are spending years of their lives in front of bad screens because monitor makers don't want to make good ones.
I feel like Apple's 32-inch 6K display would be the sweet spot for me, but it's 60 Hz and costs what, $6,000? I just use 27" 4k monitors for work. It's fine but I'd definitely like something a bit bigger and even crisper. I have to use Windows for work though.
With the LG I'm about a meter or less away from screen and use window management tools to pull focus to the center lower section for any focused work. I run Win 11 from an RTX3080 card with a 2.1 HDMI cable. 3840x2160 120Hz.
For gaming I just use windowed mode and use the full width of the 65" but just the lower half usually for COD or FPS games. I don't notice any eye strain or other issues but do run everything I can in dark mode including using the browser with the Dark Reader extension.
Not for my use case. Even as someone who's been active in the AR/VR industry for 10 years plus, it's more comfortable for me to look at a screen than it is to wear glasses. I've tried working in xreals, quest 3, with virtual desktop, etc. They're pretty good - just not as good as a monitor or in this case TV. I'm confident over time things will improve and eventually might be on par but there's plenty of use cases where you might want a screen and that will likely persist. Thanks for the question!
From experience with a 55” 4K OLED as main monitor, I can attest that the length of the caveat list is not indicative of the total impact of the caveats. It’s more an indication of a thoughtful and thorough person writing the list.
I went with the LG CX model based on what I read on rtings.com
That’s a previous-generation model. I think all of the LG TVs are good.
There are / were technical caveats. I believe all of them are solved by M3 macs that have HDMI 2.1 ports. (M3 or M3 Pro or something? The ones advertised as 8K capable.) Out of the box, those will do 4K 120Hz HDR with variable refresh rate and full 444 color. This is what you want.
It is possible to get that going on older machines, except for VRR which is more of a nice-to-have anyway.
I have a 2018 Macbook Pro 15”. Disclaimer!: My setup was a “complexity pet”, a tinkering project; There are simpler ways to connect a 120Hz 4K HDR HDMI 2.1 display to a non-HDMI-2-1 mac. And! My tinkering project wasn’t only about getting the display working correctly. It was more about messing with eGPUs and virtualization and stuff. Definitely a long way round.
On my Intel mac, I use an AMD Radeon 6800 XT eGPU with Club3D or CableMatters DisplayPort-to-HDMI 2.1 adapters. Plus some EDID hacking which is easy to do.
EDID is how the display identifies itself to the OS. The EDID payload can be overridden on the OS side. Mostly it’s about copying the display’s EDID and deleting the entry that says the display can accept 4:2:0 color. Only then does macOS switch to 4:4:4 color. I also created a custom “modeline” with tighter timing to get 120Hz going fully.
—Please be assured that this was way more complex than it needed to be. It was for fun!
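For anyone wondering why a modeline with tighter timing helps at all: the pixel clock a link has to carry is just the total pixels per frame (active plus blanking) times the refresh rate, so trimming blanking buys headroom. A minimal sketch, where the 4400x2250 totals are the standard CTA-861 4K timing and the "tight" totals are made up purely for illustration:

    def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
        # Required pixel clock = total pixels per frame (active + blanking)
        # times the refresh rate.
        return h_total * v_total * refresh_hz / 1e6

    # 3840x2160 active pixels in both cases. The first totals are the standard
    # CTA-861 4K timing (4400x2250); the "tight" totals are illustrative only.
    for label, ht, vt in [("standard blanking", 4400, 2250),
                          ("tight blanking (illustrative)", 3960, 2200)]:
        print(f"4K 120 Hz, {label}: ~{pixel_clock_mhz(ht, vt, 120):.0f} MHz pixel clock")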
There are much easier ways to do this. Lots of forum posts on it. On the MacRumors forums iirc? User joevt is The Man.
And even then, what I wrote above is actually easy to do once you know it’s possible.
Mostly though you really want an M3 Mac that just has HDMI 2.1 and is ready to go.
There are/were also OLED gaming monitors available, such as from Alienware. Those have DisplayPort inputs and are ready to go with almost any older Mac. Might be able to find one for a price equivalent to a TV, idk.
I believe the discussion about text rendering is referring only to a line of very cheap TVs that do not in fact have RGB pixels. They have half RG and half GB. For "normal" video content, this is a surprisingly low quality drop. For high-contrast text it's total murder. You can see the stippling pattern as clear as day and it can easily render 8-10pt text literally illegible.
IT once accidentally bought such a TV and had it in a conference room. Took us a while to convince the relevant people that, yes, it is nominally working fine, it's not "broken" in the sense that it doesn't turn on or half the screen won't light up, but it was intolerable for Zoom screen shares.
But you need to be scraping the bottom of the barrel to end up with those screens. I doubt you could find something labelled a "monitor" that has that, and, well, if you're putting a $150 40" TV on to your computer... I mean... what did you expect?
(There are also low-end TVs that are still using some crappy LCD techs with bad viewing angles that may make them difficult to use up close, but I wouldn't call that a text rendering problem... those issues just wreck everything. I once had a laptop that when used on a lap, had zero viewing angles; if the vertical middle of the screen was correct, the top and bottom was extremely visibly color shifted. Even the cheapest store brand TVs don't seem to be that bad anymore, though.)
> I believe the discussion about text rendering is referring only to a line of very cheap TVs that do not in fact have RGB pixels.
It also comes up with very expensive OLED monitors, which do usually have true RGB or WRGB pixels, but their subpixels are usually not arranged in the standard horizontal RGB stripe which breaks most implementations of subpixel font rendering. With a sufficiently high pixel density it doesn't matter, but with the ~108ppi of a 27" 1440p OLED monitor the text rendering can be quite visibly worse than a 27" 1440p LCD.
> TVs may have a different subpixel layout than monitors, so small text may suffer fringing. As of writing the Samsung VA and LG IPS panels such as the QN800A have a conventional RGB or BGR subpixel structure. One may also increase the font size or use hidpi scaling which will eliminate all pixel-level concerns.
I am excited for 8k monitors in the future, because they give you a lot more options for integer scaling than current 4k displays.
I know this is a nerdish hill to die on, but I hate fractional scaling with the blazing fury of a thousand suns. To get a 1440p-sized UI on a 27" 4k display, you can't just divide by 1.5: the OS has to render at a higher integer factor and scale it down for every frame. OS X does this best as they've had retina displays for a while, but no OS does this well, and it leads to all sorts of performance issues especially when dealing with viewports. Linux is especially bad.
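To make the render-then-downscale point concrete, here is a rough model of the macOS-style approach (the function name and structure are illustrative, not any OS's actual API):

    def fractional_scaling(panel_w: int, panel_h: int, scale: float):
        # Rough model of the macOS-style approach: pick the logical size,
        # render the backing store at 2x logical, then downsample to the panel.
        logical = (round(panel_w / scale), round(panel_h / scale))
        backing = (logical[0] * 2, logical[1] * 2)
        downsample = backing[0] / panel_w
        return logical, backing, downsample

    logical, backing, factor = fractional_scaling(3840, 2160, 1.5)
    print(f"logical {logical}, rendered at {backing}, downscaled by {factor:.2f}x")
    # -> logical (2560, 1440), rendered at (5120, 2880), downscaled by 1.33x
    # Every frame carries ~78% more pixels than the panel, plus a filtering
    # pass, which is where the performance and viewport pain comes from.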
Having said all that, I absolutely will not be using an 8k tv as a display. I'm currently using a 27" 1440p monitor, and while I could probably handle a 32" 8k display that is the absolute max size I'd tolerate. You start to get into all sorts of issues with viewing distance and angle going larger.
My 27" 1440p is fine for now. I sit far enough away from it that I don't really 'see the pixels' unless I go looking for them. It was also a crazy good deal as it's a 144hz monitor that also has a built in KVM switch that's very useful for WFH.
I wouldn't describe any OS as 'flawless', they're all doing what I describe under the hood. QT does have better support than GTK atm. I've also seen bad behavior on windows, esp with older apps. OS X is about the best out there, but even it can have issues with applications that have a view port (i.e. video editors, etc).
I'd prefer to skip all that so I'm happy staying on 1440p until 8k monitors are where 1440p monitors are today with regard to price and quality.
27” 1440p at 100% is too small for me, so 5K at 200% has the same problem. More generally, the available PPIs combined with integer scaling only yield relatively few options at a given viewing distance. More choice would be nice.
That's simply too big a screen to be sitting right in front of.
I do agree on the basic idea of not running two monitors tho. I used to, and I got neck pains eventually.
My current setup is a single 32" curved QHD monitor and I wouldn't change it for the world. It's just the right size so you can see the whole screen at once, yet large enough to run 3 browsers side by side.
Also, I want to suggest people to learn about virtual desktops rather than wasting money on bizarrely huge screens or multi monitor setups.
If you have it setup right you can flip to the other desktop quick, see what you want and flip fast. I haven't seen a good virtual desktop implementation since around 1998 though, and have given up.
Easy to switch, easy to know what you have on each one. Easy to move windows to different desktops. I go years between trying them and always get frustrated by how poorly they work, but it has been so long since my last attempt that I don't remember exactly what annoyed me last time I tried.
55" is not too big. Maybe it's too big for you, but I've been using three 32" 4k screens in portrait for many years, combined they are essentially about the size of a 55" screen. I love it and anything less kind of sucks. No, virtual desktops are no substitute for having more screen size. I use virtual desktops on my massive screen(s) and I love that too.
The 3 32” screens are probably angled around you and the total aspect ratio is extreme widescreen (side to side panning, not vertical neck up down panning). The 3 screens are likely much much better ergonomically.
I'll never understand why some people think multi-monitors involves neck pain. It's a complete non-issue.
I have a chair with a headrest, my neck never has to move at all, I can see all parts of the screen just by scanning my eyes. Very rarely does my neck ever move and it's usually to see the 65" Tv a few feet over - and even then my chair has a swivel and if I'm watching the Tv more than the computer, my chair will swivel over, not my neck.
> I'll never understand why some people think multi-monitors involves neck pain. It's a complete non-issue.
I can only speak from my own experience, and from what I heard others expressing online over the years.
For me, it was absolutely real and I thought for a while I had some other medical condition.
My issue was that I spent too much time of the day reading of the monitor to my left, twisting my neck while doing it. After a few years I started getting worse and worse neck pains.
I have seen this confirmed many times since by others online.
So please be aware this is an issue, but I am happy that your body still hasn't said no.
what are we talking about here, is your head twisted to face a monitor at 90 degrees or what? With my 55" equivalent screen setup the left and right monitors are about 10 degrees rotated from the main center screen, there's practically no rotating my neck at all. None. So I'm not sure what you're doing but if you're having to twist your neck to see your monitors, there's something not optimal about your setup. I have tons of screen real estate and never have to turn my neck.
I have been using a 55" 8K QE55QN700BT ($1400) at home since Jan. 2023 and a 55" 8K QE55QN700C ($1100 on sale) at the office since August 2023.
I can tell you that I will never go back, but there is definitely room for improvement.
Biggest negative: Sitting close (~12"), the far areas are probably at >45° angle (and TV colors are not great at angles)
Eye strain is ok (lowest brightness & low contrast), but neck strain is a thing (which I never had before, but now I think my neck muscles start to be trained and it's getting better).
Price is amazing, compared to 4x 4K
Refresh rate 60Hz: not for first-person shooters (but great for work)
I didn't think it was possible, but at work I operate an 8k, 4k, and a full HD screen from the same graphics card (i need the other screens for UI testing)
Glossy is not great (no matte TVs on the market)
Even at 12", I operate Windows at 175% scaling, so I still have pixels to spare. Next TV will be 65" :)
At the beginning I thought it was a failure, until I found the correct settings combination: TV Gaming Mode, disable all AI image gimmicks, set the correct refresh rate in the NVidia driver options (at first everything was extremely sluggish, until by chance I saw that the refresh rate was set to 30Hz by default). I also remember playing with the V-Sync settings until finally I didn't see visual pixel artifacts anymore. Now the quality is the same as my 4k monitor.
Work colleagues just can't see it... (just like they laughed at the 2nd monitor, and then at the 3rd, and now most use 3 monitors).
No hassle with monitor arms. Just set it on the table and done.
Microsoft PowerToys FancyZones is amazing to divide the screen into areas
Next: TCL unveiled the first curved 8k 65" TV, that's where I'm going :)
I saw your reddit comment, but I couldn't find the source for the TCL 65 inch? I'm tempted to go from 43" 4k to 55" 8k, my only concern is the edges being too far away unless it's curved.
12" viewing distance?! It sounds like you should be spending more at the optometrist and less at Best Buy. I cannot even imagine sitting that close to a screen--don't you have to turn your head just to see both sides of a document?
It's probably more like 18" currently. I don't know, what's your reading distance to a book page? I use FancyZones to dividede the screen into 3 vertical areas (like having 3 monitors). For coding and longer reading periods I use the middle area. But yes, to look at the side areas I have to turn my head or even slide the chair a little. The angle could be better, like I said.
Regarding reading distance: For me the important part is to sit straight and avoid hunching. Choose a combination of distance and screen scaling factor that works for you...
Beware of backlight offsets. TV panels can have smaller backlights, because they're meant to be viewed from further away, and my LG 46 monitor didn't have backlight behind the lower 2-3 rows of pixels and a couple pixels on the left and right, when viewed at my desk. This may not impact some people, but I often go full screen text and missing some of the left and bottom pixels was annoying. I ended up able to configure i3-gaps so that it never displayed anything in those areas, solving the problem. It worked great as a huge monitor otherwise.
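For reference, the i3-gaps side of that can be a few directional gaps in the config; the pixel values below are placeholders, and directional gaps need i3-gaps or a recent i3 release with gaps support:

    # i3-gaps: keep windows off the dead strips (pixel values are guesses for
    # illustration; measure your own panel's unlit area).
    gaps bottom 12
    gaps left 3
    gaps right 3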
I'm not a fan. Large ultra-wide curved screens are fantastic. With large flat screens that are meant to be viewed across the room, you get a distorted image when you sit up close. Your eyes have to focus further away as you look at things closer to the edge of your screen and the viewing angle for that part of the screen is different from the center of the screen. It also requires more effort for your eyes to look up and down rather than left and right. We're hard wired for that horizontal plane. This makes ultrawide screens a really comfortable option.
I almost bought an 8k 55" screen for use as a monitor, but I tested a 55" 4k screen for a week and the flatness is what turned me off to it. I've been using three 32" 4k screens in portrait, arranged in a "curved" config on my desk (2 monitors on each side are mounted at an angle), which I really like. But switching to a large single flat screen was not fun.
For me the holy grail of monitors is a 55" 8k curved screen. Not "ultrawide", I want the full width and height and I want it curved, with full 8k resolution. Maybe someday, but I'm not getting my hopes up too high.
I'm not the guy you asked but I have a similar opinion on flat screens. Personally I'd want spherical. ~15" tall and ~25" wide is about my limit for flat screens, anything beyond that I find that the corners/edges are too distant/distorted. My home setup is multiple independent 27" screens, which I like. My work setup is a single flat ultrawide (34" probably?), and I find myself physically leaning my head/body from side to side when I have two windows open next to each other. I have eye level a few inches from the top of the screen, and the lowest couple inches also seem distant/distorted.
Back in my daze at Boeing, I had a full size drafting table in addition to the usual desk. I've always wanted a display that big. In fact, I want my entire desk surface to be such a display!
I also recommend this. Been using a 43-inch 4K TV for the last 10 years. My first TV (Vu 43inch Iconium) died this year; got another 43-inch 4K TV from LG (43UT8050) now. Ensure you get one that supports at least a 60 Hz refresh rate at 4K; my first one did not. It even starts faster than Android TVs. I always keep it on game mode; this setting ensures minimal input latency and no TV-side post-processing. The smart TVs don't need to be connected to the internet, since I don't use their smartness. Finding dumb TVs is difficult here.
I'd be happy to, but there aren't any 8K TVs at 55" or smaller. I want the pixels, but I'm not going to put a 65" TV on my damn desk -- I have two 27" 4k now, and it's ... fine, I guess? but I want a 42" 8k running at 2x.
On most monitors I've been using these days, I keep scaling the resolution down. I've noticed that the bigger the text, the more comfortable my eyes feel. I still prefer a good high-res monitor because it scales down with less blur
I went through a phase of wanting the most possible screen estate to do sick multi tasking gimmicks like having chats, documentation, code editor, and prototype open at once. It was glorious, a 5k2k ultrawide monitor filled to the brim with a mishmash of sometimes related, sometimes unrelated windows.
Then it hit me that I can only focus on one thing at a time since I’m a human being, and having multiple attention grabbing things in front of me is never good. I now run a single Studio Display and have a code editor in full screen, switching to other content through virtual desktops. I’m WAY more productive this way.
Now I might just have a short attention span and that’s that, but using a TV as a monitor sounds like hell to me now.
I’m doing something like this in my current home setup, but the thing I miss most about multi-monitor is screen sharing on Zoom.
I used to be able to just share one entire monitor and could drag windows I wanted to make visible to that display. Now I tend to share single applications, and have to unshare and reshare to change the view.
First world problems and all, but it would be nice if Zoom let you partition off a part of a display (instead of all or nothing). Would love to draw a bounding box of “share everything in this box.”
I don’t think this annoyance is enough to make me go back, but there are times when I’ve considered it.
Deskpad might be what you’re after! It’s a virtual display in a window, you can share that instead of your whole screen but still get multi-app flows captured
This is something I've wanted to do for a while! I wish Samsung still produced their 55" 8K displays-- 8k @ 55" gives you effectively the same PPI as a 27" 4K display. Maybe someday.
That's a hell of a desk. And counter to the argument that "you could just have the one huge screen for entertainment AND work" because this is not a desk you can easily clear out from in front of the sofa when you stop working.
This is making me want to get some blackout curtains for my living room so I can go back to occasionally working with my laptop hooked to the projector, though. It's about the same resolution as my laptop but it's really nice to be focusing on something across the room for a change.
I use a 50" 4K TV as my monitor. It's mounted on a long TV mount that can bend at 3 points, one near the wall, one near the TV and one in the middle. Gives me great freedom. One warning to people who want to do the same: make sure your mount has a way to rotate (around the screen's surface normal) the TV as the weight of it will make it sag.
Am I the only person who wants a monitor that's curved in both axes (left/right and up/down) so I can surround myself with a sphere of monitors, and then pivot on a gimbal?
it's around 100 degrees while humans can see more like 180 degrees (more if you move your eyes; I don't want to move my eyes, I want to gimbal my body to focus on a specific monitor) although outside the center of your vision, you don't have good "resolution". The Vision Pro would be like being inside the sphere, but with a big aperture blocking all the side monitors
For anyone aged 50+, wider monitors are harder to use than multiple monitors, especially if you don't use varifocal glasses. I bought a 47 inch monitor some time ago but had to go back to a multi monitor setup with smaller monitors because I can't see the sides and corners of the larger one without moving or standing up. Works better if you have a standing desk though.
I use a 55" TV as a primary monitor at my desk also (alas only 4k). I've found that as tempting as it may seem to use the entire 55", at normal eye-distance, it's too big an area to track and it requires 45 degree head-rotations to scan across it (i.e to shift my focus from one window to another). It's like sitting in the front row of a big movie theater.
What I do instead is configure the desktop to only use the equivalent of about a 36" display. That way it works like a single large monitor that I can scan left to right with mostly just eye movement.
Why have a big TV as a monitor only to use a smaller portion of the screen? The TV reverts to just being a TV during non-work hours (when we use its full dimensions), so it doesn't feel like wasted real-estate - in fact it frees up space for me in the office by doing double-duty as a monitor and a TV.
I've been using 50" 4K/60 TV's (3x actually) as monitors since 2015, and I love them. Prior from about 2007 on I used 6x 24" LCD's, and in wanting to upgrade, didn't make sense to bother with small LCD's to go vertical with another row for 12x displays. I found Samsung curved 4k LCD's at the time for around $650 each shipped around black friday, so it was a no-brainer. I've never looked back really, or would consider anything smaller now.
I am wondering how 8k displays would look replacing my current samsung 4k's as these are pre-HDR, but I'll probably use these until they start dying with no complaint. Plus no one does curved displays now, which I'll miss from my current TV monitors.
I've always wondered why everybody would buy "monitors" for computer use. Isn't it the same thing as a television screen? Back then TVs used to take different inputs but everything is digital now.
That checkerboard effect is certainly interesting. Someone somewhere is going to be nostalgic about this artifact someday, maybe they'll even make a shader to emulate it. I wonder what causes it and why it disappears in game mode.
> on Linux it took about two years for 8K 60 Hz support to work, spawning a salty thread on GitHub
All I see is paying customers asking for support.
> The AMD on Linux fiasco is because the HDMI Forum has prohibited AMD from implementing HDMI 2.1 in their open source Linux drivers.
That's weird since nvidia's open source driver has an implementation.
TVs generally have more input lag, poorer color fidelity, and except at the high end like 8k the pixel size is often inappropriate for viewing close up.
There's less of a gulf now than in the past, but TVs are generally made for media watching at a distance.
Heh, I do something similar as well, with a 48" LG 4k OLED, which seems popular with other users as well. I got this over another 4k or 8k TV because 1) OLED simply looks better and 2) 120 hz is nice for gaming, but I do want to get the same type of TV but with 240 hz instead for some of the higher twitch games.
I use Windows and the PowerToys utility, which might arguably be the best window manager I've used, even above tiling window managers on Linux, simply because I can specify exactly the layouts I want for every single virtual desktop and every single app.
Overall it works well but for the first little while I did get a headache from sitting too close, but it went away soon after.
It's strange how Retina-level PPI has mostly been ignored by the market. There are some options, but not a lot, and certainly not much competition. I've wanted to upgrade from my 1440p 144 Hz Nixeus IPS display for a while, but haven't found much.
I'd be willing to give up >60hz refresh rates, which pulls in the Apple Studio Display, but it also only has one port, which means I'd be giving up my gaming PC. That was mostly on the way out anyway, but it's currently a psychological blocker.
Has monitor tech stagnated, or is this another case of the tyranny of the majority (less-than-Retina PPI) crowding out everything else?
I've been using TVs as my monitor for probably 10 years now. I've been on my current Vizio 50" 4k TV since before covid. I think I bought it for $280? I do everything from programming, watching or up-scaling videos, and playing FPS games on it. My only issue has been vsync related tearing in certain games like God of War for PC even though I have vysnc settings on, but other than that, no problems. I do well in fast-paced Fortnite games, so latency hasn't been an issue either. I'm near-sighted and hate wearing glasses, so having the larger fonts has been a blessing also.
I use dual 27" 144kHz 4K monitors and am mostly pretty happy with my setup though I have considered moving to an Ultra Wide curved monitor, I'm just not sure if the OCD side of me would be bothered by the curvature.
Unless I'm misunderstanding, one of the advantages of using physically distinct monitors is that it's easier to send things into a full screen mode without affecting the other displays - I guess apps that support "borderless windows" are less of an issue.
Maybe there's some type of cross platform (Mac, Linux, Windows) virtual display driver software that can allow you to create "picture in picture" virtualized monitors though?
>Unless I'm misunderstanding, one of the advantages of using physically distinct monitors is that it's easier to send things into a full screen mode without affecting the other displays - I guess apps that support "borderless windows" are less of an issue.
This is one of the reasons I stuck with two monitors instead of one long one when I upgraded a while back. I know there are workarounds and helper programs you can install and whatnot, but I like being able to drag something to the side and full screen it without any additional hoops. Plus the long monitor crowd tend to have things centered on the screen and then have small accessory areas to either side instead of two distinctly large screens. Plus resolution wise, unless you're going with a really wide monitor, you probably have more overall resolution with two screens, especially if price is a factor at all. Standalone 27" monitors are basically the standard and are priced accordingly.
My Dell monitor has a picture-by-picture mode which works very well to simulate 2 distinct displays. Each side uses its own video input. Many higher end monitors can do this, unsure how many TVs can.
I've tried all sorts of configurations and I settled on two 27" 5K monitors (in HiDPI mode) on a VESA mount so they can easily be rotated for different use cases. The biggest selling point was text clarity. I really like using my code editor in vertical mode.
My previous setup was a 34" curved ultrawide and I didn't like it. Primarily due to the poor text quality and generally feeling it encouraged too much distraction.
I like being able to have very separate concerns per-display -- code editor on one display and browser on the other, and then I use virtual desktops on each one where needed.
> 8K TVs may be driven at 8K 60 Hz with no chroma subsampling by using HDMI 2.1, which is available on all current (Nvidia RTX 4000 series and AMD 7000 series) and previous gen (Nvidia RTX 3000 series, AMD 6000 series) graphics cards. Older computers with GPUs outputting DisplayPort 1.4 may use adapters such as the Club3D one to achieve 8K 60 Hz.
Isn't "plain" DP 1.4 confined to HBR3 - thus its maximum refresh rate is 8K-30Hz?
If you’re looking for a monitor with high pixel density and a ton of real estate, you can also buy a monitor. 5k2k’s are pretty sweet. I’m driving one of these nowadays and it’s fabulous, without all the quirks of adapting a huge TV for computer use: https://www.dell.com/en-us/shop/dell-ultrasharp-40-curved-th...
HiDPI, two 4k monitors without a bezel, 120Hz, and no need for a separate thunderbolt hub.
>I don’t usually watch movies on this, but when I do, I set it to 4K 120 Hz mode. The 120 Hz is nice as it is divisible by both 24 fps and 30 fps, which are common framerates for movies, although The Hobbit running at 48 Hz would benefit from Variable Refresh Rate (which does work).
This is naive speculation lacking empirical support. Most movies are released at 23.976 fps, which would necessitate at least one double frame being presented per minute at refresh rates which are multiples of 24 fps. Also there is no 48 fps version of The Hobbit currently available on home media.
I'm embarking on a similar geek journey. Just today I bought a used radiology PACS display (barco mdcc-6430) just to see if there is anything novel or cool about the picture or any clinical features. I'm not expecting much but stuff like this is how you find out.
This display is color, however I have considered getting a grayscale only rads display for "ADHD purposes" i.e. the same reason people are interested in e-ink displays (well, one reason).
It will probably be a huge waste of time and money but I'm just a masochist for tech pain I guess...
No, I miss monochrome displays too. Yes, sure, color can provide some benefit in syntax highlighting, but that's a pretty weak argument.
I wonder if there's an interesting research opportunity: how complex does a syntax need to be before highlighting is beneficial - and is it ever, really? I don't think highlighting ever helps me understand code or even more quickly grasp it - it's mainly just a way to cue a syntax problem.
I used a tv as a monitor for a while and it was great -- but there is one problem with single monitor setups -- screen sharing/recording. If the app you're using lets you select a portion of the screen to share, that's great. But something like Slack you either share an app window, or the entire screen. This is very annoying in a single monitor setup. It would be amazing if you could select a part of your screen and tell the OS "treat this area like a separate monitor".
I have used one of the original 4k TVs-as-a-monitor ( https://www.avsforum.com/threads/review-of-the-seiki-39-4k-d... ) as my central monitor (plus one on each side) for 10+ years now. Not feeling any need to upgrade (don't do graphics/games, just lots and lots of text terminals and browser windows)
At home I use 2 28" 3:2 4k displays and in the office I use the same setup and 2 additional 24" WQXGA-Displays and I like the ability to spatial arrange windows and corresponding tasks. My mind just doesn't work the same with one huge display. I even noticed this back in the day when multiple displays meant 2 17"-19" 4:3 or 5:4 displays and the first colleagues started to use the first 30" displays with 2560x1600.
I use a single curved 57" 32:9 DUHD monitor (Samsung Odyssey NEO G95NC) for work and gaming. Previously I used 3 24" monitors, but I like this setup a lot more.
I split it into 3 sections (browser for docs, and rest terminal/nvim), but i can easily change this if I want to show slack for example. For gaming I go fullscreen (and use overlays for stuff like VOIP or browsing) because it is a lot more immersive.
I have used a Samsung 43” 4k smart monitor with my mac mini for sometime as a replacement to multiple screens and it was a productive experience. However, I had to stop in 2-3 months as my eyes would get dry and itchy at the end of the day. Using such a large screen at such a short distance isn’t great for the eyes I think.
> You can even use the same TV for 4K 120 Hz gaming or watching movies as a bonus!
But you can't use the computer at the same time then. With a 3 monitor setup I can add an HDMI switch to one of them, and when I want to play, then I can switch that monitor to connect to the PS. This way I have still 2 monitors to use. Then one can be used for TV in the browser and the other one for other stuff.
Assuming that those audio speakers are at ear height (I assume they are since those IsoAcoustics stands allow tilting but there is no tilt in the picture) then IMHO the display is placed too high, ideally you want your eyes level just below the upper edge of the screen. I don't blame OP though, I just think with this type of screen size, it is challenging to achieve that.
I used a 32" non-curved 4k monitor for a few months once. At some point I realized that I was moving my head around a lot as the corners were at an awkward place. On 28" I don't have this.
So anything above 30-ish inches I would consider either curved (expensive for hidpi resolutions) or two/three 27" screens angled a bit.
I can't imagine how bad it would be on a 65" flat screen.
I wanted to go down this path some months ago, but couldn't find any options on the market. I ended up with a 42" 4k LG C3, but it's just "ok" because I can easily see pixels. I wanted to use the room as dual use work/watch movies, but without the need to watch movies I'd probably go back to a wide screen curved display.
I sorta tried this, using a single one of those large 4k curved monitors at my desk in San Francisco before the pandemic. It was alright, but I always liked having two 2k monitors more. At this point, as an Awesome WM user (there are dozens of us!), I really depend on having two different monitors so I can have two different sets of tiling window tags.
I already use a 4K TV for a monitor. 8K would just push a need for a more expensive video card, while decreasing how well people can see when I share my screen. Even on a 4K, I need to blow it up to ridiculous zoom levels to make a screen-share readable to others.
I'm sure not everyone would run into that problem, but it is a fairly strong con to be aware of.
If you are on Linux, you can divide the entire screen into multiple virtual monitors and share only one of them. This has the benefit of giving you "private" monitors that won't be shared.
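On X11 specifically, one way to get such virtual monitors is xrandr's --setmonitor, which splits one physical output into logical monitors that screen-share pickers treat as separate displays. The output name and geometry below are placeholders for your own setup:

    # Split a single 3840x2160 output into two logical monitors so that
    # screen-share pickers treat them as separate displays. The output name
    # (DP-0) and geometry are placeholders; the values after the slashes are
    # the physical size in millimeters.
    xrandr --setmonitor shareable 1920/300x2160/340+0+0 DP-0
    xrandr --setmonitor private   1920/300x2160/340+1920+0 none
    # Revert with: xrandr --delmonitor shareable; xrandr --delmonitor private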
Another option could be to temporarily lower the resolution.
I used a 43" 4k TV to replace a multi-monitor setup, and the neck and eye strain was brutal for me. Even with a really nice display with a high refresh rate, viewing the corners from that close up was worse than useless. The brightness was difficult to tune down enough to reduce eye strain and from that close up reducing blue light through software wasn't very helpful.
I've since switched to a 32" 4k curved display (still 16:9, not ultrawide) and have been much happier. The curve makes more of the view useful from the periphery and the display has some quality-of-life features, like displaying multiple inputs as separate and ratio-configurable "monitors" in hardware. It's also nice to have controls on the display; the TV relied on the remote, and I kept losing track of it.
The only thing I miss is being able to switch to watching sports at the end of the work day, and being able to cast video to it. Those were luxuries duplicated by other things already in the house. I'd like to say I miss gaming on it but I honestly don't, it's much nicer to not have to extend the keyboard and mouse back far enough to also see the entire display at once.
I work mostly with text and code so the curve isn't an issue, and I could see designers preferring a flat panel to avoid distortion. Otherwise I'm not sure I could go back to having such a large display, much less a 65" display.
EDIT: Per another comment, I have mild hyperopia diagnosed about a year into using this setup, which continued for another year after getting glasses to correct it. My prescription has not changed since getting the new display.
I've been using a 34" 1440p curved ultrawide monitor (21:9) since 2020 and it's been amazing. Earlier this year I decided to try using a 42" LG OLED TV as my monitor and lasted about a day before deciding to go back. I 100% agree with you RE: viewing the corners of the flat screen. I'll never go back to a flat monitor/TV for my primary PC again. I think my ideal monitor is ultrawide, curved, 1440p, OLED, and 38" or so.
The new Macs with M4 Pro/Max have Thunderbolt 5 which supports up to 120Gbps video bandwidth: 3x that of previous gen. That should cover 8k120 or multiple 8k60 displays! And the regular M4 now supports 8k60 displays I believe due to DSC.
New ideal setup is 8k120hz TV placed about 3-4' away. Better for the eyes too.
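As a rough sanity check (a back-of-envelope sketch in Python, counting active pixels only and ignoring blanking/protocol overhead), the numbers suggest DSC is doing the heavy lifting:

    # Does 8K at 120 Hz fit in Thunderbolt 5's ~120 Gbps of video bandwidth?
    width, height, refresh_hz = 7680, 4320, 120
    bits_per_pixel = 30  # 10-bit RGB
    raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    dsc_gbps = raw_gbps / 3  # Display Stream Compression is typically quoted near 3:1
    print(f"uncompressed: {raw_gbps:.1f} Gbps")  # ~119 Gbps, borderline before blanking overhead
    print(f"with DSC:     {dsc_gbps:.1f} Gbps")  # ~40 Gbps, fits comfortably

So uncompressed 8k120 is borderline even at 120 Gbps once overhead is added, which is presumably why these modes lean on DSC.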
In my experience, when a flat monitor gets too large the edges end up much farther away than the center, and as I glance around my eyes need to refocus too much. That’s why I vastly prefer curved screens. I currently use a 4K 32in curved monitor by MSI and for me it’s just perfect.
Just got a 32" 4k. I had a 49" 4k in the past, but it broke. My issue with monitors above 49" is it strains the eyes and head looking around. I always had to partition the screen or manually resize, it got annoying. Gonna try 1 4k for landscape and 1 for portrait now.
For me, the best monitors by far for programming are LG's 28in DualUp, due to the aspect ratio. I have a pair side by side, and it's effectively four 1440p screens in a 2x2 layout, giving lots of vertical as well as horizontal space on each screen without a bezel.
The one issue that I have with using TVs as "monitors" is that they are too damn "smart." They play with the image, and it can be a devil to find all the settings to turn that off. On my Samsung, there are a couple of things that I can't turn off.
I love my Acer Predator 43” 4K. It’s small enough that I don’t feel like I need to extend my desk to sit far enough away, and it also just squeaks under the max load for the Ergotron HX monitor arm.
It’s extremely sharp for normal use, and doubles as a 4K 120Hz monitor for gaming.
I'm not sure if they ever shipped it to any retail customers. I'm a JVC projector owner so I kinda follow JVC projector news. The higher-end JVC PJs are used by Boeing for flight sims.
JVC accommodates that use case with things like extra chassis mounting points to allow the projector to be mounted securely in a dynamic environment. This looks like it may have been an early POC in native 8K for Boeing.
I use a 4k TV. I've wanted to upgrade to 8k for a while, but according to this post AMD on Linux can't do 8k, so I guess I'm sticking with my current setup.
My 780M already struggles running GNOME at 4k, so maybe that's for the best.
I moved from a 3x 24" 1080p monitor setup to a single 43" 4K display... and holy smokes what a difference. The only thing it's missing is some curvature. That's needed at larger sizes.
I got a 4k 32" IPS monitor years ago and was never really happy with it for similar reasons. Scaling never felt right, 2x is like basically using 1080p and fractional never looked good or had issues (in linux). Now I have a giant curved samsung display which is like two 1440p monitors side by side. There is a new version that is like two 27" 4k monitors side by side and I am just worried I would have the same issue I had before... I kind like being able to see my pixels :)
That’s typical of TVs. The signal is delayed by a few seconds because, for passive entertainment, why not. Your TV will likely have a mode that does no post-processing and has minimal delay, often called PC or game mode. Look up “[your tv model] gaming mode”.
For 1 and 2, I would say it totally boils down to personal preference and the distance/size ratio. For 3, again, distance to the screen matters a lot.
The 4th one I've seen the most heated discussions about. In my opinion, the highest resolution you can afford (both money-wise and computational-power-wise) is the most useful. Even if you can't distinguish the individual pixels (i.e., no visible screen-door effect), aliasing is still an issue.
Ultrawide 5K2K is a great sweet spot, at least for what I do, which includes a bit of everything. I never liked dual monitors with a split in the middle. Ultrawides solve that.
I have a 34" from LG, but I absolutely CANNOT recommend the brand. It's been unreliable, and is currently being repaired after the panel died on me.
However, when it worked, it was an absolute game changer.
I have a feeling they might not be able to repair it, in which case I am getting a Lenovo P40w-20 which is currently on sale for $1099 at Walmart for some reason.
But yeah, we're talking $1k+, if you want both high-resolution (5K) and ultrawide. If you're ok with good old 16:10 aspect ratio, then a 5K monitor is much more affordable.
I'd love to do this but always worried (probably incorrectly) that the energy output wouldn't feel great and result in faster fatigue or require more rest breaks.
My wife asked me how much "huge monitors" cost. I told her 100 bucks on Craigslist. Indeed, we got her an old dumb 1080p LCD and she has been super happy with it. It mostly fills the wall of her little cubby hole in our office.
For my money, I have 2x 1080p 24” displays, and a third curved 32" 1080p display which is hooked to a KVM so I can game on it.
I like the 3 monitor setup because they are all at angles from each other, approximating a huge curved display. Plus, this was a cheap setup off woot.com parts.
I think monitors are like headphones. Unless you actually try the "better" ones, you don't have a clue what you're missing. I know because I had been saying "Dual 1080p 24" is all I will ever need." for a long time until I got a 4K 50". Now I can't imagine going back.
I usually use a pair of Sennheiser HD280s that I've had for over a decade. I've used some fancier headphones costing more than an order of magnitude more, from brands such as ZMF. After experiencing the high-end advantage, I'm still perfectly happy with the 280s. There are a few things I care about in a monitor, and DPI is nowhere on the list. Every monitor commercially available has more resolution than I care about. My number one concern is consistency across a wide viewing angle. Low latency, retina DPI, gamut accuracy, HDR, curved surface? I don't care about any of them. I have tried all of them.
May I ask at what configuration? I'm assuming at least one is vertical because I can't think of a way to set 2 43" monitors horizontally without breaking my neck.
I want 4k/5k and an 18"-21" diagonal, but all the hi-dpi smaller screens go to laptops and tablets, I guess. No monitors like that. Hell, under 27" and 4k can be tricky to find these days. 24" models exist but are a shrinking category.
I don't want or need my monitor to take up a huge amount of space. But I do want high pixel density. Looks like I'm in too small a market to serve.
Modern TVs have decent input lag around 10 ms which is on par with professional monitors, but of course it will still be worse than gaming monitors. Lots of people game on their TVs. And most TVs have settings that disable postprocessing.
I guess. I think the important thing is getting the program in your head, not on the screen. If the code is too complicated to hold it all in your mind then more columns of crisp text will not save you.
I actually kind of agree with this. For me, the more pixels the better (I'm sensitive to fuzzy text, and subpixel rendering makes it worse), but I'd really prefer just one monitor, not too big. 15-19" is fine, especially if it's 4:3. 1600x1200 on a 17" monitor would be really nice.
A couple of years ago, I spent a year or so like this, with the TV resting directly on the desk.
It looked pretty nice, but it had some problems.
- The only actual 8K modes reported over HDMI were some variant of YUV, which meant you could not select what your OS considered an RGB mode
- Even using it at 4K, with the 55" TV a couple of feet from the back of the desk, my eyes could not keep all of it perfectly in focus.
- The power consumption was much higher than a typical ~30" monitor, and the amount of heat created was also significant. This became hard to deal with in summer.
Eventually I gave up on it and returned to a ~30" monitor.
All else being equal, a TV (i.e., TV-sized) unit generally has a broader set of use cases and a longer useful lifecycle for the original purchaser† than a computer monitor, which could be argued to make good economic sense.
† in my experience, computer monitors can have a long useful life when factoring in the potentially long tail of "donor/hand-me-down" cases...
I built my own laptop that uses a framework 13 mainboard with dual 16” >4k 16:10 1000nit HDR panels. It is the ultimate portable battle station. It has a tripod mount that I use with a magic arm to hold it at eye height. I use dual trackpoint wireless keyboards for better ergonomics.
I've long considered going this way myself, but 8k is tricky for a number of reasons:
- I am very sensitive to glare, and all TVs are glossy
- Smallest size you can get is 55" (up to 50" would be good for me as I keep my 32-incher on a custom 8" stand — I am pretty tall — so it would simply be a wider screen that goes to my desk with top being at the same point)
- Connectivity sucks: I am so used to running only my laptop with a single USB-C connection. I had enough of that with the early Dell MST 24" 4k screen that required 2 DP 1.1 connections IIRC (basically the same thing their 32" 8k has).
- I mostly use Linux (Mac for work though)
So I am waiting for a monitor that can do 8k at 60Hz with an ultraportable that runs Linux and an iGPU that can drive it for productivity (software dev, browsing, video calls — yes, full screen video call is a hog at large resolutions, at least in Linux).
I'll probably sacrifice on the resolution front next (4k at 32" is not enough either) and go with a 4k option at 42-43" people have mentioned elsewhere.
Be aware that TVs take screenshots every few milliseconds and send them off for advertising analytics. I wouldn't use one as a monitor while it's connected to the internet.
Nope... I want to be able to properly split the screen between different inputs, because of the lack of proper window/workspace management when you're not using separate monitors.
> TLDR: If your job is to write code all day [...], buy an 8K TV instead of a multi-monitor setup.
Counterpoints:
• All my keyboard muscle memory is set up for multi-monitor setups. Theoretically fixable with the right tiling window manager... which I would presumably have to install, since I do too much Windows stuff to go full-time Linux. Or perhaps develop one. Buying more monitors is a better use of my time.
• I curve my monitors inwards, intentionally, for better viewing angles. Also lets me hide a tower in one of the corners behind the curve on a straighter desk.
• I do too much multi-machine development (e.g. testing refactoring of multi-platform abstractions). HDMI switches are super convenient; your TV's picture-in-picture functionality... may or may not be. Dual Windows PCs for testing on nVidia and AMD simultaneously, or remaining unblocked while one is busy reformatting/reinstalling/compiling/linking/syncing 100GB+? Yes please. It's often interactive enough to want to keep open, yet passive enough to need something else to do. OS X for iOS and Linux for debugging server code? Sure. iOS and Android? Well... those have their own monitors. Consoles don't though, and I've targeted those too.
For an entertainment setup, I can usually scrape by with 2 or 3 monitors (1 landscape for a fullscreen game, others typically portrait for chat/wiki/etc). Right now, I'm on a 75" 4K chonker. I have good eyes, but 8K would be a waste of pixels, and I'm already close enough that the viewing angles are noticeable. Yet, I still hauled out a second monitor: an old 2.5K to exile junk I want to monitor off the main screen.
For a development setup, I've bought or brought a 4 x 27" 4K setup if one isn't provided. A 5th monitor has occasionally been useful (1 landscape for console, 4 portrait for console IDE, devtools, devtools IDE, and docs/wiki/jira/chat/notes. Replacing the 4x portrait with 2x 8K landscape... would probably work, at least, although I'm not convinced it'd feel like much of an upgrade, if any.)
TL;DR: You can't really replace a monitor wall with a single screen, because it does not curve to create the right viewing angles. That makes text seriously unreadable at the edges, which forces you to upscale the font size, which in turn steals the largest amount of screen real estate possible. Of all the compromises to make, reducing the number of screens is one of the worst ones.
4k screens are already somewhat questionable for productivity for this reason alone. The only serious argument to be had is 1440p vs 1080p (personally I would argue for 1080p, if using bitmap fonts and having perfect eyesight). A 4k monitor wall is a rather fringe setup that only works out to an advantage for day traders and odd surveillance applications, and it requires constant, energetic body gymnastics to change your perspective's location so you can see all the details.

With a single 8k screen and no upscaling of font size (hence preserving all the technical real estate), the body gymnastics required would be so much worse than with a 4k wall that it would be absolutely ridiculous, clownish, and almost impossible to use while typing. Otherwise, people mainly want big 4k/8k screens for dual use as a TV set. But this is just wrong in itself; it creates a paradox for no good reason, like using screwdrivers as chisels. Some things are not meant to be. The only arrangement where 4k makes some sense for common use cases is maybe above a curved ultrawide.
I normally work with a 40", I'm using a a hammerspoon to divide the screen, but normally I end using one main window, with some smaller window at the side and cmd-tabbing between info. How do you manage the distraction of so many information at the same time? Do you switch between apps? use the mouse? don't you loose track of where the focused window is?
People and their need for a "leader", no matter the quality. We've had enough "truth tellers" and "follow me men" kinda shills.
Time to realize that not everyone on the internet is your friend. They feed you bullshit all the time and laugh at how gullible people are, how they question nothing and just follow based on the perceived merits of an individual.
Huh? One screen for email/Slack/..., main screen for the IDE, other screen for logs, etc. It's a lot less context switching to glance left/right than to go to another virtual desktop.
[1] https://pics.dllu.net/file/dllu-lidar/tldr_707_all_c_fine_50... (13006 x 7991 px)
[2] https://news.ycombinator.com/item?id=41102135