
I used to hate Microsoft for locking us into shitty software for decades.

But looking back, I see that they gave us all an enormous windfall in the form of commoditized hardware with decades of hardware growth. (Hardware is the complement of OS software, so driving hardware costs down drives OS sales up.)

You would think Apple as a hardware company would open up software to increase hardware sales, but instead it seems to try to control everything so it is fighting a battle on multiple fronts.




> You would think Apple as a hardware company would open up software to increase hardware sales, but instead it seems to try to control everything so it is fighting a battle on multiple fronts.

Purely from a business perspective, Apple has seen tremendous benefit from their locked ecosystem and vertical integration. Bringing that strategy to the PC market was bound to happen, and it's likely going to work extraordinarily well for their shareholders if the performance/productivity benefits (from Apple Silicon) at the low-mid end force traditional PC consumers to the Mac.

From a consumer perspective: would we have accepted, a couple of years back, a $1000 PC with no (official) way to install another operating system, only 3-5 years of updates (if lucky), only manufacturer-approved apps, and repairs only at their approved centres?

Then why did we accept it as the norm for >$1000 smartphones?

We made them smell money with our consumer decisions to trade 'freedom in computation' in smartphones, and it's now coming back to haunt us with personal computers. The line between smartphones and PCs has been blurred by Apple Silicon. Google will do it with their Chromebooks (which was already happening even without custom silicon: update cycles, locked bootloaders, etc.) and Microsoft with their Surface lineup.


There’s never been more diverse software, more readily and easily available, than there is today (mainly due to the web and app stores). Software has never been easier to write, to distribute, or to monetize.

Users don’t care whether the platform is “open” or whether they can install Linux. In fact, in many cases, those things are a massive source of pain to end users who want devices that just work, which the iPhone and iPad largely do.

It’s also, by the way, never been easier to build your own hardware from ready-made components and platforms.

I don’t know why we should lament users choosing devices that are easy, fun and reliable to use, and that provide them with single tap access to massive software libraries and entice them to pay for that software. Seems like an absolute win to me.


>I don’t know why we should lament users choosing devices that are easy, fun and reliable to use, and that provide them with single tap access to massive software libraries and entice them to pay for that software. Seems like an absolute win to me.

I'm finding it difficult to see this as an absolute win given that Apple's absolute control over these devices facilitates human rights abuses and a general trend towards censorship and authoritarianism all over the world.

As a developer I don't see it as an absolute win if my distribution channels are dominated by an oligopoly of two all powerful gatekeepers. But I completely understand that consumers don't care about that or even like it.

I also understand that consumers don't care much about Apple's cultural anti-porn bias. It's all on the web anyway.

But what about human and civil rights? Can we really celebrate something as an absolute win if it hands absolute control over our access to encryption to anyone who happens to control Apple?


> But what about human and civil rights? Can we really celebrate something as an absolute win if it hands absolute control over our access to encryption to anyone who happens to control Apple?

How exactly is Apple going to ban encryption? Isn't HTTPS outside of Apple's control?

And how would Apple manage to "ban encryption" on Windows, Linux, or Android devices?


Obviously not https, but if end-to-end encryption or some consumer VPNs are banned then people whose only personal computing device is a locked down Apple device will instantly lose access.

It's not going to be up to individual users to decide whether or not to use it anyway, and perhaps fight for their constitutional rights in the courts.


Exactly. This is why I'm wary of all these arguments for opening up the iPhone. There is large value (in the form of security and minimum quality bar in app review) in the closed ecosystem.

If you don't like it, it's never been easier to build a replacement, or install a dev certificate on your iPhone and load whatever you want.


Build a replacement…what? iPhone? Exploit for letting you have private entitlements?


Take any Android phone that allows rooting and install your own stuff.


When software has a political agenda, stopping it harms free speech, but not stopping it is a threat to business (as with China). Hence most customers don't care. Try to deal with the post-China world.


If the early pioneers of Computers & Internet thought the same way we wouldn't be even having this conversation. I think they made conscious decisions to keep computing out of total control by capitalism.

Unfortunately we've failed them and ourselves with our consumer decisions.


Uh, not really. Usually quite the opposite actually. For instance, Linux, the modern land of free/open computing, comes to us from Unix... which was made by Dennis Ritchie and Ken Thompson, two early pioneers of OSs... who worked for Bell Labs, owned by AT&T (and maybe shared with Western Electric? I forget).

Computers back then were far too expensive for them to be anything but reliant on capitalism. It was only with the commoditization of hardware that free and open computing really even became an option.


True, but Bell Labs is hardly an example of normal capitalism. As I understand it, AT&T during the Bell Labs era was more like the East India Company, or a PRC-style state-controlled corporation, or Pacific Gas and Electric.

(I find that this is an under-explored option in discussions of socialism-vs-anarchism-vs-capitalism these days. State-controlled corporations seem to have a very good track record. But I haven't seriously studied this.)


I believe they were actually prohibited from entering the computer market.


I disagree. The landscape has changed significantly since the 80s. A locked-down platform makes a lot more business sense now than it did back then (unfortunately). It's not like computing company pioneers were so ideological that they turned down $$$$ for freedom. Nor is it like there weren't more closed platforms back then; the closed platforms just got outcompeted.


> with our consumer decisions to trade 'freedom in computation' in smartphones

TBF, the first few iPhone releases were arguably better and more open than anything before them. Apple refused to bow to carriers and provided a standard development platform for the first time. Then the App Store, again bypassing carriers, increased developer access to mobile platforms by 1000x or more.

Sadly, both consumers and developers then failed to push for even more open alternatives, to the point where Apple and Google managed to entrench themselves too deeply to address this problem through simple market mechanisms. It's time for authorities to step in, hopefully we're seeing that (slowly) happening.


I don’t think Android was well understood by the Linux and open-source community in its first three years. A lot of people were happy that it was consumer Linux and were hacking away on root kits, bootloaders, and alternative marketplaces. We didn’t realize Google would gain an effective monopoly on software distribution inside the Android ecosystem (at least in Western markets).


What? There are other software stores available for Android, and even APKs for the adventurous.


I recall the first iPhone couldn't run apps at all. It had "web apps".


Yeah, but they refused to integrate with carrier crapware or otherwise customize it in any way to suit the carrier. iOS was iOS, and that was it. This seems trivial now, but at the time it was new: most phones were sold with a carrier-tailored operating system and apps, which made it difficult to build anything that ran on multiple devices.


It was quite easy; just like with iOS, you had to pick one platform (Windows CE, Symbian, BREW, J2ME, ...) and target those devices.


Did you actually do it...? J2ME was a nightmare of incompatibilities. You could basically use it only for games, because you took control of the whole shebang. The slightest attempt at integration with platform services or native widgets brought utter pain across devices.

Symbian also changed drastically from featurephone to featurephone. WinCE was a bit more consistent, but nobody used it on actual phones; it was largely a PDA OS, and PDAs were a very small market.


Yes: J2ME targeting Nokia and Sony Ericsson devices, Symbian on Nokia devices, and I used to travel occasionally to Espoo.


You must have few hairs left then :) I worked a bit on J2ME on Nokia and ran away very quickly. Cross-device testing was a massive (and expensive) issue. It was definitely not comparable with the ease enjoyed on iOS today, where there are very few devices and the emulator is enough most of the time.


The only hair I lost in those days was caused by Symbian C++ and the multiple reboots of the development environment, from Metrowerks all the way to the burning-platforms memo.


I have an 8-year-old MacBook Air. Still getting updates, still working perfectly well. Best $1000 ever spent on a PC. Zero seconds invested in configuring or setting up anything.

At the end of the day, buying a computer is a tradeoff. A lot of people will very happily trade off freedom for other values if the value proposition is good.


I purchased a MBA 4 years ago (i5/8GB/256GB) and it is by far the best investment in technology I have made. Ultra reliable, amazing battery life, light weight and nice to type on. At home it's plugged into a monitor/kb/mouse like a desktop. I like the tight software and hardware integration, which extends to an iPhone and iPad.


You win some you lose some. My 2018 MBA had to go back and ended up being returned permanently. I’m considering risking it again on an M1 MBA though.


I have a 2013 13” MBP that I have not been nice to (to put it mildly). It’s working great.

I have a 2013 15” MBP that I babied and kept in immaculate condition. It also worked great until some RAM failed a couple of weeks ago. It’s a brick now.


If Apple were to release low-level documentation and source code for hardware it considers obsolete, to help developers support it, it would not affect their business other than earning a lot of goodwill.


it may not affect them at a pennies-in-billions level, but those pennies are owned by a few patent holders whose heirs will profit many times over in the foreseeable future


Why? I doubt those obsolete chipsets will be used by anyone else, or that anyone will be particularly motivated to shell out a lot of money for access to schematics.


they are assets in the big machine, being sold or open sourced would reduce them and their associated incomes to nothing


What patents do you have in mind? If things are patented anyway, it wouldn't hurt to document them.


Apple started out open. The Apple II has 7 extension slots and loads of peripherals available. It was also user serviceable. This is what Steve Wozniak wanted, and it worked, it was a smashing success. Steve Jobs, on the other hand, had another vision for the company, where Apple would control the user experience. The Macintosh Plus had just two extension slots, and users couldn't open the case, you needed a special extra long screwdriver.


Exactly; Steve Jobs envisioned a closed architecture for the Mac. Thankfully for Mac users who wanted a more open experience, Apple released the Macintosh II in 1987, which was styled similarly to PCs and had six NuBus expansion slots. From then until the release of the cylindrical Mac Pro in 2013, Apple always had Macs with expansion slots in its lineup. From 2013 to 2019 Apple didn’t sell Macs with internal expansion slots, but Apple resumed selling internally expandable Macs once the current “cheese grater” Mac Pro was released, albeit at a significantly higher price point compared to the 2006-12 cheese grater Mac Pro.


On the contrary.

There was nothing open about NuBus, QuickTime, QuickDraw, QuickDraw 3D, ...

Apple's platforms, like every other 16-bit computer with the exception of IBM PC clones, were always their own ecosystem.


Another way to look at it:

Microsoft forced PC buyers to use their software by making deals with OEMs to preinstall it on every PC, hiding the cost of the software from the consumer. Most consumers did not purchase a PC with no software installed, and then purchase a license to Windows separately; the software and license came with the computer.

There are probably more similarities between Apple and Microsoft than there are differences, however tempting it may be to focus on the differences.

People love to criticise the RPi. It has its flaws and shortcomings. Nevertheless, it is a rare example of a computer that does not come with an "OS" preinstalled. Buyers can choose from a variety of OSes and make their own bootable SD cards.


The Raspberry Pi does have an OS preinstalled that users can't remove, which is why it's so hard to get full support for the basic Linux stack on there. The GPU has a proprietary low-level OS/firmware blob that handles basic system functions, loads Linux, and starts the CPU once all that is done, and it is required for the board to start. This is a big part of why Armbian/Ubuntu don't have full support yet, for example. It's not impossible, but it's weird and complex for OS developers, and it's one of the strengths of the alternative boards, which can generally boot and run a full Linux stack with hardware support for everything on the board.
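You can see this boot chain on a stock Raspberry Pi OS install: the GPU firmware blobs sit on the FAT boot partition right next to the Linux kernel, and the GPU executes them before the ARM cores are released. A rough sketch (file names are from Raspberry Pi OS; paths may differ on other distros, and on newer models the first-stage bootcode.bin lives in onboard EEPROM instead):

```shell
# The GPU's proprietary firmware lives on the FAT boot partition next to
# the kernel; the GPU runs start*.elf before the ARM cores ever start.
# Guarded with a fallback so the command also completes on non-Pi machines.
ls /boot/bootcode.bin /boot/start*.elf /boot/fixup*.dat 2>/dev/null \
  || echo "no Raspberry Pi firmware blobs found"
```

On a Pi you would also see config.txt there, which is parsed by the GPU firmware itself rather than by Linux.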


> This is a big part of why Armbian/Ubuntu dont have full support yet for example.

Care to outline what you mean by that? Ubuntu has official support for most of the newest RPi models[1].

In fact, I’m running Ubuntu 20.04 on my RPi 3B+ right now.

How is it not fully supported? What am I missing?

[1] https://ubuntu.com/download/raspberry-pi


A bunch of hardware acceleration features (primarily involving the GPU, because of the weird it-runs-its-own-mini-OS situation) are either not currently implemented or can only be used via a rather hacky kludge. Once the GPU boots and passes off execution to the ARM CPU, that works mostly as intended, but talking to the GPU again and getting it to do heavy lifting is still a work in progress.


What's the best RPi alternative in your opinion?


I drool over the Odroid N2+, but mostly I just use a Raspberry Pi. 32-bit Raspbian is fine for me and most of it works fine (although the fake kernel-mode-switching video driver is less than ideal), and I don't really care to lose the huge community for when I need to figure out an issue. It makes figuring out problems a Google-search solution and not a debug-probe solution.


I think steps are being made to open up the architecture.

Documentation for a lot of it has been published, but only partial drivers have been developed by the community.

For example, I think Mesa has some OpenGL hardware acceleration, and I know Kodi does hardware video decoding. I also believe there's some work on U-Boot.


It was not Microsoft, it was IBM that “gave us” commoditized hardware.


It was MS who pushed it big; they wanted standardisation of both professional and 'home' computers: the IBM PC (clones) and MSX[0] respectively, both running MS software. MSX failed, but the idea was the same: a hardware standard everyone would adhere to, with MS providing the software. MS was a huge factor in making that happen; no one knows what would have happened if they hadn't.

[0] https://en.wikipedia.org/wiki/MSX


I don’t think Microsoft had anything to do with the PC hardware becoming open. IBM chose commodity hardware. https://en.wikipedia.org/wiki/IBM_Personal_Computer#History:

“The idea of acquiring Atari was considered, but rejected in favor of a proposal by Lowe that by forming an independent internal working group and abandoning all traditional IBM methods, a design could be delivered within a year, and a prototype within 30 days. The prototype worked poorly, but was presented with a detailed business plan which proposed that the new computer have an open architecture, use non-proprietary components and software, and be sold through retail stores, all contrary to IBM practice”

That was before even the choice for a CPU was made (makes me wonder how prototypical that prototype was), so I don’t see how Microsoft would have been involved at the time.


When IBM lost control of the PC market, as Compaq and other clones started coming out and its proprietary PS/2 machines were rejected by the market, the power to define what a PC was fell into the hands of Microsoft, who still holds it today through their Windows hardware certification program. It is Microsoft who is keeping PC hardware open, albeit probably more out of fear of further anti-trust scrutiny than any altruistic motives.


Reality was more that MS /benefited/ from the IBM-PC clone market rather than MS /caused/ the IBM-PC clone market.

IBM gets some credit, for making the IBM-PC with commodity components, which set the stage. But it was the clone makers (with Compaq being first, but certainly not last) that ultimately caused the standardization around the IBM-PC style systems. And the clone makers were driven by the fact that, at the time, the market (esp. businesses supplying their users) wanted to be compatible with the IBM-PC, while saving costs over buying an actual IBM-PC from IBM. MS benefited from the explosion in sales of IBM-PC clones by being the OS provider for the IBM-PC, so as a clone maker, to be fully compatible, you also needed MS's OS on your clone.

The standardization process occurred some years later once the market had clearly moved towards the IBM-PC architecture. And MS likely had some hand in guiding that process, given their monopoly at that time in the OS that every clone maker wanted to use. But by the time MS was powerful enough to begin any guiding (or "forcing") the market itself had already "standardized" because of the huge sales potential of being "IBM compatible".


I agree that IBM gave us the PC with the BIOS listing and open hardware specifications.

(and they tried to close the barn door with the PS/2, Micro Channel, and OS/2, but failed pretty miserably)

Meanwhile Microsoft, with its non-exclusive software agreements, courted hardware vendors and made MS-DOS and soon Windows work with a multitude of hardware products. It fostered hardware competition and drove down prices.


No, IBM would gladly have prevented it; they even tried to counter Compaq's success by releasing the PS/2 with the MCA architecture. They just failed to turn the market around.


Apple's marketing approach is primarily to view hardware and software as inseparable parts of the same product. Their main differentiator in the market is their ability to control the end-user experience to a greater degree than their competitors.

They are probably of the opinion that opening up software would decrease their target customer satisfaction and subsequently decrease sales.


> Apple as a hardware company

Apple is now (primarily) a software-service company, and from that point of view, a locked-up platform makes a lot of business sense (unfortunately). Selling hardware is only the first step in locking customers into their service-ecosystem. In this new Apple world, app-developers are essentially Uber/Lyft-style gig-workers, not independent businesses.


> Apple is now (primarily) a software-service company

They like to tell everyone that, but it's still very much a lie. More than 50% of their revenue comes straight from iPhone hardware sales. Services are barely under 18%, and that includes absolutely everything they can throw in there (iCloud, App Store, etc.). Everything else is hardware.

Apple is a hardware company that is desperately trying to ensure their future when, inevitably, they'll get a few iPhones wrong and consumers will move on. It's a bit like Persian Gulf countries investing in airlines and anything else to ensure they'll have a future when oil runs out.


Apple has made many hardware mistakes and will continue to do so. All their software is designed to sell hardware; they rarely support non-Apple hardware.


IBM did that by licensing the PC "clone" design, not Microsoft. Microsoft added the lockdown layer on top.


They did not license anything, Compaq stole it from them.


Business is conservative. If it stops making bucketloads of money with the current formula then they will change. Otherwise expect the same for as long as it works for them.


Nobody who actually understands Apple's business and how it works "would think" this.


Ok, but instead of putting down someone who knows less, please share some of what you know so the rest of us can learn something.

https://news.ycombinator.com/newsguidelines.html

(a recent longer explanation of this principle, for anyone who cares, is https://news.ycombinator.com/item?id=25130956)


You're right. Again.


Yeah, maybe. They are driving the price of software to zero, which helps iOS and macOS device sales. But I think if they work hard to close things off, they might end up with a bigger part of a smaller pie. I don't know; maybe they don't need help from people on the outside and can do it all themselves.


Apple is in the business of selling systems that work to end users. Unfortunately, the only way that they can provide that assurance is total control over both the hardware and the software. In fact I suspect that within ten or so years, Apple will eliminate the final dependency -- on NVIDIA -- and migrate the Mac (and everything else) to a custom ISA.


I don't think Apple depends on very many companies at all anymore. Not Intel, Nvidia (or ARM), or AMD.

I would say they have transitioned to depending more on folks like Samsung and TSMC.


I don’t think you’ve been paying attention.

Apple just released their first laptops with custom silicon cpu/gpu, declaring their independence from Intel.

They dropped Nvidia ten years ago.


What dependency on Nvidia do Apple have?

From memory, Apple haven't used Nvidia in any of their products in years due to bad Nvidia behaviour ages ago.


They are an ARM licensee. ARM was acquired by NVIDIA. If there's bad blood between NVIDIA and Apple, all the more reason for Apple to drop ARM and go with a custom, Apple-designed ISA.


Apple has a lifetime license for ARM. There is nothing Nvidia can do to Apple on this front or any other.

But more than that, Nvidia needs Apple. Apple selling ARM Macs to developers means that all developer toolchains are being updated to support ARM. This is vital to Nvidia’s plans for ARM server chips. It would never have been a reality without the M1 or something like it.


ARM was a joint venture between Apple and Acorn; each had 43% of the capital back in 1996. Apple has a lifetime architecture license; they couldn't care less about who owns ARM...


Not sure Apple would ever go completely custom; their MO when they need a component is to try to find something existing first (WebKit, LLVM) and adapt it. If there was bad blood, they’d sooner adopt RISC-V.



