ICE's decision was essentially to change nothing. A student on an F-1 visa is only permitted to take a certain number of online classes and must maintain status with in-person classes. Moving classes online therefore affects one's status, and the ICE notice meant they would not relax these rules even in this situation. What people are saying is that this affects one's work status if they have to leave the US after their studies, but ideally a student is just a student, not someone seeking work or immigration into the US, especially given that the H-1B is suspended and OPT is training, not work. If the university says classes are online, then there is arguably no reason for the student to live in the US. People conflate students with long-term immigration, even if that may be the student's intention. As far as anyone is concerned, they are just students paying tuition and living in the US.
> People conflate students with long-term immigration, even if that may be the student's intention. As far as anyone is concerned, they are just students paying tuition and living in the US.
Even if it's only short term, for the duration of their studies, it seems quite unfair to expect students who have set up a life in the US to return to their home country for a term or two when they might well have to come back to the US again the following year.
My experience has been that the students who are there for high school 2.0 spend the most, while many foreign students need to find work to make ends meet. They are typically more motivated and focused than local students, spend less time going out (though they travel more) and far more time on their studies. This is also typical of continuing-education students.
That is irresponsible. To study in the U.S., students have to rent an apartment, purchase necessities and make lots of other preparations. If they are suddenly not allowed to stay in the U.S., who covers those losses for them?
Things are not so cut and dried. Rent in the college town my university is located in is high, and people have to sign year-long leases the year before they actually occupy the apartment due to high demand. Forcing students to travel home places unnecessary hardship on them if they've already signed an expensive lease.
Doesn't the US hope that those highly educated students stay in the US afterwards? I was always under the impression that this was the main goal of those programs. Get the smartest from other countries, show them how good it is in your country, and hope they stay and contribute to the economy.
Absolutely not. Maybe the US should, but according to the rules a student visa requires "nonimmigrant intent", and expressing the desire to stay in the US after your studies are over may in fact be grounds for denial of a visa, because they may think you're going to overstay.
F-1 is a "nonimmigrant" visa, meaning it is explicitly designed so that students do not stay after finishing their studies. There are "dual intent" visas designed with the possibility of staying in mind, such as the H-1B. But technically, expressing intent to stay can be a reason to be denied an F-1 visa.
> Get the smartest from other countries, show them how good it is in your country
That may be true, but that's not what F1 is made for, by design.
Many of these students pay the department's bills as graduate-only students, and for the students it's a springboard into the US job market. Both sides enjoy the benefits of a misaligned program.
Depends on the administration, which is really a problem if you want to consistently attract talent. Seems like an opportunity for Trump to ostensibly change nothing while taking what looks like a hard line on immigration.
There's an added benefit to Trump of punishing universities for their continued pushback against many of his goals (e.g. Title IX lawsuits, harboring undocumented people/not complying with ICE).
> Doesn't the US hope that those highly educated students stay in the US afterwards?
I'd say the average American does not hope this, no. Foreign students are driving up the cost of tuition, as they're usually quite wealthy, and they take seats away from underprivileged Americans, especially in state-run schools.
I don't know why someone would come to the US to study other than the hopes of immigrating. There are better schools in Europe and elsewhere that are much cheaper or even free.
On the off chance we actually face some sort of shortage of critical skills, there are other visas for that.
That's not true. Almost all state universities charge a much higher tuition fee to foreign and out-of-state students (sometimes two to three times higher). That higher fee is used to subsidize the fee of in-state students. Sorry to be crass, but please do some research before spewing your bullshit online.
> That higher fee is used to subsidize the fee of in-state students
An uncited, flimsy excuse. Most universities could function just fine without out-of-state students, but that might require cutting back some of the wasteful spending these institutions engage in.
In any case, the seats are a finite quantity. Most public universities in the US have competitive admissions. Guess who's not getting admitted in place of students on visas? That's right, the most underprivileged parts of our society, which is why I said seats.
Even if "it's only classes", there are things called time zones; if you're in Europe or Asia then it would fucking suck to have to get up everyday in the middle of the night to take classes.
This ignores the enormous logistical difficulties associated with, say, time zones. If my online classes are at 2 AM but the grocery store is only open from 9 to 5, I am going to be exhausted all semester.
It is not this simple. Some international students are on significant financial aid and rely on the resources provided to them in the U.S., be it college housing and dining or something else...
You don't need old hardware to verify the speed differences. Just run a browser under i3wm on a bare Arch Linux install and it will be faster than on Windows 10 on the same machine, or at least snappier. The overhead is simply much lower on Linux because you can tailor the system to your needs and strip out the bloat, and the Windows GUI is more taxing on the system, so Linux is better at rendering something like a simple browser. Linux is slightly faster in some benchmarks (like Geekbench), but overall the differences are minor when comparing apples to apples. It's GUI-related work where Linux has the edge, and some recent games have even performed better on Linux than on Windows according to an LTT video, which comes down to the real interest and development going into Linux gaming.
I feel like this argument is the same as the one about how Microsoft and Apple allow third-party apps on Windows and macOS.
To me there has always been a trusted core of computing that is audited to some extent and marked as trusted, and browser extensions work the same way as software on an operating system: if vendors blocked every extension outside a trusted set, they would be criticised for that too. However, the auditing process is very controversial and could end up like Apple's App Store, where apps/extensions may be blocked for reasons other than security in order to stifle competition, which is certainly possible with Chrome.
In 2016 we proved that the owner of "Web of Trust" was exfiltrating and illegally selling clickstream data to anyone who would pay. For Germany alone the data contained the browsing information of more than three million people, often revealing highly intimate and sensitive details about their lives. Still, Chrome and Firefox reinstated the extension after less than four weeks, and to this day it keeps collecting clickstream data. It does so using dark patterns, and I'm sure most users are not aware that a free extension they use to increase their safety while surfing the web surreptitiously sells their browsing data.
If the main selling point of your browser or OS is that you protect the privacy of your users you simply can't act like that, because most users are not aware of the data collection that is happening via these extensions.
With mobile apps we're in a similar situation: companies like X-Mode exfiltrate and sell location data via apps that claim to protect your privacy. Desktop software: same story. Anti-virus software that is supposed to protect you actually exfiltrates personal data from your computer.
So yeah if you build an open platform there will be such abuse, but if you position yourself as a champion for privacy you simply can't allow that (or at least you should try to make it more difficult).
There are simple counter-measures that browser vendors could employ: Showing users how much data a given extension sends to a backend and ideally making that data transparent would be enough to stop most of these practices, because people would then realize that their free screenshot app somehow sends every single URL they open to a backend service. Right now this can happen entirely without the knowledge of the user. You can't control what you cannot see and understand.
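Browsers don't surface this today, but you can get a rough picture yourself by routing the browser through a local proxy and tallying outbound bytes per destination. A minimal sketch using mitmproxy (the script and the grouping by host are my own illustration, not anything the vendors provide; it won't separate extension traffic from page traffic, but an unfamiliar backend host receiving every URL you visit stands out quickly):

```python
# extension_traffic.py -- run with: mitmproxy -s extension_traffic.py
# Tallies outbound request bytes per destination host, a crude version of the
# "how much does this extension send home?" indicator described above.
import logging
from collections import defaultdict

from mitmproxy import http

bytes_per_host = defaultdict(int)

def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    # URL + request body is a rough measure of what actually leaves the machine.
    sent = len(flow.request.pretty_url) + len(flow.request.raw_content or b"")
    bytes_per_host[host] += sent
    logging.info("%s: %d bytes sent so far", host, bytes_per_host[host])
```

Point the browser at the proxy (and install the mitmproxy CA certificate) and the per-host totals show up in the event log as you browse.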
> Showing users how much data a given extension sends to a backend and ideally making that data transparent would be enough to stop most of these practices
Exactly! I find it abhorrent that not even Firefox has something straightforward like that as a "first-class" feature. Most of the extensions I use shouldn't need to communicate with any server at all to begin with, so having to just trust the author's word, or manually audit the code on every update (or stop updates altogether) and maybe fork the project (if that's even possible)... doesn't make sense.
The one thing I'm aware of that these extensions could do to sidestep such a mechanism is to inject scripts on pages that then exfiltrate your data, but injection could also be blocked, and as a last resort I trust uMatrix would have me covered ;)
I am an outsider interested in this topic; it would be great if you provided some links. I've found the Web of Trust add-on [1] and its Privacy Policy [2]:
> Automatically Collected Information
> Internet Protocol Address (trimmed to permanently remove specific location information other than country, city & postal code); device type; operating system and browser; Search engine results page (keyword, order/index of results, link of result, title, description, ads); web pages visited and time stamp of the visit; display ads; and WOT user ID.
That is awful [3], but it's what almost every web page wants to do. A privacy-respecting browser should not run JavaScript, should ignore cookies, and should block non-first-party resources. That's what my browser does. But that is not where the consensus lies.
As I understand it, Mozilla allows extensions to collect information if this is disclosed in the Privacy Policy. It would be great to have a "Collects Information" badge.
The data under "web pages visited and time stamp of the visit" is your clickstream data (you can check which data the extension sends using the network tab in the extension developer tools, though some extensions go to great lengths to obfuscate it).
Thank you very much. Too bad, but my German is not that good.
Most users live in a privacy nightmare and accept it. They also run closed-source OSes and applications. The truth is that privacy has a cost: monetary (the Apple ecosystem) or time/experience (Linux, etc.).
Apple can hire maintainers; Linux users can become maintainers. Those who live in free-as-in-beer land get free-as-in-beer support.
The initial part perhaps falls within the widely accepted consensus on monitoring usage, but an extension collecting all "web pages visited and time stamp of the visit" crosses the line into the totally unacceptable.
I think the difference here is that browser extensions are distributed through a package manager provided by the browser vendor. People expect the vendor to perform some sort of security validation on apps that it is effectively publishing. You can't reasonably have the same expectation for binaries downloaded from a website.
After using a rMBP for 6 years, I realized that using a lower-resolution, lower-quality display makes no sense at all if both the graphical power and the budget are available (the first 13" rMBP had some serious issues driving its display). A better-quality image is a better-quality image. I think Apple's biggest selling point over any other vendor right now, despite the numerous issues with its hardware and software in the recent past, is absolutely top-class input and output. It is such a simple concept: a great keyboard (which seems to be fixed now) and an absolutely incredible trackpad, along with a display that is a huge step up from what you're used to, mean that most users will prefer that setup even if they just use it for basic coding or web browsing. Looking at the first Retina displays, I realized that Apple didn't just change the panels; it changed how fonts rendered entirely, because crisp, clear legibility was key to attracting customers early on. I'd say that even in 2020 most computers struggle with good displays, which can completely ruin the experience of using a product even if every other aspect of it is great.
I've been focused on Apple display products for the past few months as I'm looking to make an upgrade from the Dell P2715Q 4k 27" I use primarily for development.
There is a three page thread on using the Apple 32" XDR for software dev on MacRumors. [1]
I believe there is a major product gap at Apple right now in the mid-market display. Specifically a replacement for the LED 27" Cinema Display which was announced 10 years ago next month. [2]
I am speculating that Apple could announce a new 27" 5k in part because of the rumored announcement of a new Mac Pro but also because the build quality of the LG 5k Ultrafine is just not great and there are obvious synergies with XDR production and Mac Pro.
The reason I think this should be announced at WWDC is that developers specifically are being left out of good display products, and Apple should be looking out for what they stare at all day.
While there are no supply chain rumors of such a display, I wargamed what this product might be and its pricing anyway.[3]
In short, I speculate Apple will release a 27-inch IPS display, 5120x2880 @ 60Hz, with standard glass at $1999 and nano-texture glass at $2799.
I had not paid a lot of attention to the refresh rate, but it does seem like kind of a miss that the XDR does not offer this.
I have a Dell P2715Q and I'm happy with it. I haven't tried anything with a higher resolution or DPI, but I can't see any individual pixels on the Dell, so for the moment, it's good enough for me.
I can't go back to sub-4k though. Looking at a 24" 1920x1080 monitor tweaks my brain. Pixels ahoy. It's jarring. I'm not the kind of person who cares about superficial things or style or brand at all, but I just can't get comfortable with sub-4k anymore.
Be very wary of Apple monitors. If you can, try one out in the environment you intend to use it in, before you commit. Apple displays are highly reflective. The glare is obscene. The display quality is great, but I can't deal with the eye strain. It's like there's a mirror glaze on top. They used to offer a matte option, but I don't believe they do anymore. It's painful.
This is an important point. I have a pal with an aging 5k imac that he wishes he could use only as a monitor with a new mac mini.
It seems the display tech in the iMac is heavily discounted in order to sell the whole machine. And that does break the pricing unless you see them as different products offering value for different configurations.
Bringing back Target Display Mode, or allowing the iMac to act as an external monitor, would greatly increase the value of that product.
I can't explain exactly how this pricing makes sense, except to assume that Apple will want or need to price this stuff high to differentiate its build quality from the LG UltraFine collaboration.
I bet somebody could make a business refurbishing those old 5K iMacs into usable monitors. Either yank the panel into a new case w/ a custom board to drive the display, or hack an input into the existing system (maybe with a minimal booting OS).
Either way, that'd be a cool product to see and seems like a decent side hustle to get some $. :)
I recently bought a 2015 27” 5K (5120x2880) iMac core i5 for cheap, put in an SSD, and upped the RAM to 24 GB. It handles everything I can throw at it as a developer (albeit not a game developer, just “full stack” web/java/node/k8s) and the screen is just incredible.
The SSD upgrade is not for the faint hearted, however. I used the iFixit kit with the provided “tools”, and lifting off 27” of extremely expensive and heavy glass that sounded like it might crack at any moment was not exactly fun. Having said that - I would do it again in a heartbeat to get another computer/display like this for the price I paid.
With regards to scaling: I have had zero problems with either my rMBP (since 2013) or this iMac, when in macOS running at high resolution. Zero. As soon as I boot Windows or Linux, however, it’s basically unusable unless I turn the resolution down.
I've got a Planar IX2790 and it's great. Article is spot-on re. scaling -- it's much nicer to look at all day (native 2x scaling) vs. 4k at 1.5x scaling.
Thanks for this idea. The design looks surprisingly like the old cinema display. Actually, apparently they use the same glass from the cinema display but no camera. [1]
It looks like the main concerns on this are around stuck pixels. Have you gone through calibration / QA on yours? [2]
Otherwise, seems like a compelling alternative to the Ultrafine.
I believe the theory that it's 5k iMac panels that Apple rejected. I have a few stuck pixels but I never notice because they're so small. I have a blue stuck-on in the lower-right quadrant and I can't even find it now.
I keep thinking such a display's design could overlap with the new iMac. One reason Apple used to have a "chin" on the iMac was to distinguish it as a computer rather than a monitor / Cinema Display. Judging from the leaks, the new iMac won't have a chin at all, and since there is no similarly sized Cinema Display in the lineup this doesn't really matter anymore.
I just wish they'd bring back Target Display Mode, or something similar to the iPad's Sidecar.
- Proved and debugged on the XDR release at the pro price point.
- Kept working on how it can be cheaper and fit into plans for whatever the ARM-based machine's initial graphics capability will be.
- Designed it to use a manufacturing line similar to the iMac's, then offered it at a lower price point to support the current Mac Pro, Mac mini and a possible dev kit for the ARM Mac.
Or I suppose just keep making everyone buy the LG 5k ultrafine that is four years old. :P
I just got a U2720Q and it worked at 60Hz right away, though I am connecting over USB-C with the provided cable.
What I did find off-putting is that Dell say they don't test with Macs, and thus can't support them, but simply assume that Apple follows the same specification they do. A bit weak, but the monitor does work fine.
Thanks for this idea. The product page says it is 218 PPI, but I don't see how that is possible given it is 34" wide. Pixensity says the actual PPI is 163.44. Can you confirm the PPI on this monitor?
It's 163. I think you got the 218 from a Cmd-F that picked up the SEO blurb for the 5K UltraFine in the footer.
I cannot tell the difference between the LG and my MacBook's retina screen at my normal viewing range, but this may just be my eyes.
I also use a non-integer scaling factor on both screens, as I find it gives the right combination of resolution and real estate for me, and I don't notice artefacts.
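For what it's worth, PPI follows directly from the pixel dimensions and the diagonal, so the two figures above are easy to sanity-check. A quick sketch (I'm assuming the 34" monitor in question is a 5120x2160 ultrawide, since neither comment states its resolution):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from the pixel dimensions and the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(5120, 2160, 34), 2))  # ~163.44 -- matches the Pixensity figure
print(round(ppi(5120, 2880, 27), 2))  # ~217.57 -- the 27" 5K panel, i.e. where the 218 comes from
```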
I honestly find MBP trackpads to be too big, which is admittedly a preference thing, but their keyboards are absolutely horrible, and I have difficulty understanding how anyone could think they were "great". There isn't enough distinguishing the keys from each other, so I can't ground myself on the home row. I can't think of many keyboards I've used throughout my life that I enjoy less than the MBP's. And that's not even mentioning the quality issues (duplicated or broken keys), or the lack of Fn keys.
On the trackpad I agree, but definitely not on the keyboard. And Apple was late(!) to hi-res screens and to HiDPI, and now has lower resolution and density than the competition (e.g. better 16:10 4K panels on the Dell XPS, or 3:2 screens from MS and others).
Also, Apple shipped TN screens forever while the similarly priced competition had higher-res IPS screens.
Isn't 4k on a laptop a significant power drain? And isn't the point of "stopping" at Retina resolution that the human eye can't tell the difference between Retina and higher resolutions like 4K at typical laptop screen size and viewing distance?
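As a rough back-of-the-envelope on the second question (the numbers here are my assumptions: roughly 20 inches of viewing distance, and the commonly cited ~60 pixels per degree as the limit of 20/20 vision):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """How many pixels fall within one degree of visual angle at a given viewing distance."""
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

# Assumed panels: a ~220 PPI Retina MacBook vs. a ~283 PPI 4K 15.6" laptop, both at ~20 inches.
for label, ppi_value in [("Retina ~220 PPI", 220), ("4K 15.6\" ~283 PPI", 283)]:
    print(label, round(pixels_per_degree(ppi_value, 20), 1))
```

Both land well above ~60 px/deg at that distance, which is the usual argument for why the extra pixels of a 4K laptop panel are hard to actually resolve.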
>"Of course, the real highlight is that new Retina Display. Its resolution is 2,880x1,800 pixels, providing a level of detail never seen on a laptop before. The highest standard Windows laptop screen resolution is 1,920x1,080 pixels, the same as an HDTV."
>The signature feature of the new MacBook Pro is the new 15.4-inch (39.11 cm diagonal) Retina display. [...] So far, no notebook screen has topped resolutions of 1900 x 1200 pixels (WUXGA) or 2048x1536 (QXGA in old Thinkpad models).
If you manage to dig up some obscure laptop that had a higher resolution at the time I wouldn't be completely surprised. However, to suggest that Apple was "late(!)" with high DPI screens is provably false and frankly, ridiculous.
I had a 1080p Alienware laptop in 2004. Then for some reason, laptops all went even lower resolution for a long time, and I couldn't find a good one until Apple came out with their Retina displays. Not sure what happened. Manufacturers just became cheap?
I've found the keyboard on my Surface Book to be much better than the keyboard on any of the MBPs I've used and owned.
The trackpad was just as good, too. But of course, the OS was worse, mostly in terms of performance. Windows 10 just always feels slow for some reason.
The problem, as explained here and as Linus and more recently GKH have pointed out, is that the lack of consumer ARM hardware like laptops and desktops means that kernel developers and app developers basically cannot develop and test their code easily before deploying it in the cloud... If Apple were able to sell ARM Macs, and on top of that also provide cloud instances of the same platform, they could essentially be at the core of an architecture revolution.
I have both Spotify and Apple Music, but somehow the Spotify interface seems just simple enough that I can play what I want and the interface gets out of the way. With Apple it's almost like they want you to listen to what they want you to listen to, not what you want to listen to. This is basically what companies do when they push their own content over the content you prefer, and it almost always results in bad interfaces.
On the other hand, I recently restored my iPhone as new. The default settings may prompt users to disable tracking, but Siri is now a big tracker itself: by default, Siri learns from apps how you use them. While this may not be advertising, and the data may stay on your device or with a trusted company like Apple, the idea of privacy should be that everything is opt-in by default. Who's to say Apple is tailoring your iPhone's behaviour 'to your needs' when in reality they are just trying to make you invest more into their ecosystem by learning from you? That might sound helpful and might make the experience better, but the basic idea of tracking is still the same.
I think these chips go into really low-end computers, possibly Chromebooks or sub-$300 laptops where Intel falls short on pricing. But from what I can tell, AMD is already selling $300 3200U laptops which probably perform similarly to these.
Nonetheless, I find it really amazing that only three years ago 2C/4T was normal for high-end flagship laptops like the XPS 13 with the 7200U, and now those are the lowest-end chips on offer.
All tribalism aside, that's entirely to AMD's credit. If it were up to Intel, we would still be getting the same old overpriced dual-core crap, 3-5% faster than the previous generation.
Quad-core workstations have been available for years; I literally have one from 2010, and I doubt it was the first even then. There were probably Core 2 Quad or Phenom X4 workstations before that.
AMD had their own quad-core laptops available for a long time too; the Excavator/Piledriver series goes back many years as well.
Cheap dual-cores have been around forever as well; this is not something that AMD brought about with Zen. This is not really a distinctive product in most ways, it's just a replacement for those old cheap Excavator parts for bargain-basement laptops. So I'm not sure what you're trying to say here other than 'AMD good'.
Dual-cores have always been the standard for ultrabooks, but those laptops are literally defined by eschewing performance in favor of making them thinner and lighter. If you make a processor that pulls half the power, the workstations will take twice the cores and the ultrabooks will slice off another 2mm of thickness and stay with dual cores. Zen did not change this dynamic.
Yes, Zen 2 is a good product, but this is not a Zen 2 chip, nor does it have a notably higher core count or anything else. It's a replacement for the old A8 Piledrivers.
> Nonetheless, I find it really amazing that only three years ago 2C/4T was normal for high-end flagship laptops like the XPS 13 with the 7200U, and now those are the lowest-end chips on offer.
The MacBook Pro has shipped with quad-core Sandy Bridge since early 2011, along with many others in the PC world.
At the low end, in Chromebooks, I think Arm processors are a better way to go. These days Chromebooks run Android apps and Android on x86 hasn't been great.
Gemini Lake is really good in the $250 to $350 segment. They have great hardware video decoding/encoding and the general CPU performance is between Core2 and Nehalem level, while offering Chromebook level battery life.
There are Chinese laptops like the Chuwi Lapbook Pro that get you a 1080p IPS screen, a quad-core Gemini Lake, and 8GB of RAM for $320. I'd rather do that and be able to run standard x86 software at semi-reasonable speed than mess with a Chromebook. (I can see the draw of a momputer you don't want to have to mess with, though.)
I'd love something in the 7nm or 10nm class, of course. If Dali was on Zen2 on 7nm it would be fantastic. It's just not possible yet in this price range. AMD is still launching 14nm in this segment, not even Zen+. Next-gen Atom (Skyhawk Lake) is going to be on Intel 14nm as well.
I'm curious how Dali/Barred Kestrel does on the extreme battery life/chromebook thing though. Raven Ridge (and the Zen+ successor, Picasso) did not have great idle power and this hurt it there. If they could get the idle power down, it would be a good alternative to Gemini Lake.
In many segments, low-end included, the processor is a small part of the total BoM. Display, memory, clamshell, keyboard, assembly and shipping are non-negligible.
One of the fun things about ARM is that you get to play with asymmetric multicore machines that don't exist anywhere else in the desktop space. At least for now.
For me, the worst flaw of Chromebooks is the keyboard and the lack of proper Control, Super and Meta keys.
This paper revealed to me some of the things that modern designs have adopted which the average uninformed coder would never notice (and may not need to, most of the time). Recently, though, I have been looking into HFT machines, and these things run C code (a friend works at a small startup and says they use C programs for most of their orders) on regular computers, with many even using off-the-shelf hardware (Intel's 9900KS and 9990XE are hot targets, and AnandTech and ServeTheHome have shown off some of this hardware). HFT firms are highly sensitive to optimizations, and my understanding is that the closer they get to the hardware, the better the returns, given how competitive it can get.
With so much sitting between a high-level C program and the low-level instructions on a CPU, I wonder if we will see companies like JP Morgan and Morgan Stanley (big ones, with money and time to invest) enter the chip business in a serious way. That could bring some of those optimizations back to the consumer space, and startups in the area of fast, efficient C code might then find themselves in trouble. For now, the field seems open to competition.