
Of course, the other plausible explanation is that (at least at the time) Weird Al didn't know the difference between RAM and hard drive space (especially likely, because he's talking about defragging a hard drive). In fact, I'd be surprised if that wasn't what he meant. I will, however, leave room for him knowing the difference, but intentionally being absurd (it is Weird Al, after all).

That problem (referring to things that are not conventional fast, volatile random access memory as RAM) seems to have only gotten worse in the last twenty years - exacerbated, I believe, by the increased use of flash technology (SSDs, etc. - which have, admittedly, blurred the lines some).

It also doesn't help that smart phone and tablet manufacturers/advertisers have long insisted on referring to the device's long-term storage (also flash tech) as "memory". I doubt that will ever stop bugging me, but I've learned to live with it, albeit curmudgeonly.

Whippersnappers! :-D




I’m going to have to stan for Weird Al here and say that there’s basically 0% chance that he didn’t know the difference between RAM and hard drive space. He’s actually quite meticulous in his songwriting and pays a lot of attention to details. And with a mostly (white and) nerdy (ba-dum-tssshhh!) audience he knows he’d never hear the end of it if he screws up. Must be quite the motivator to get things right.


He even mentions hard drive space separately, as something comparable to removable storage:

You could back up your whole hard drive on a floppy diskette

You're the biggest joke on the Internet

I can only conclude that he did understand the distinction between RAM and hard drive space. The song is full of jokes that show a deep understanding of the topic. Indeed I don't find any nits to pick.


Also, taken in context, telling somebody their whole HDD fits in a floppy is entirely consistent with bragging about your entirely unreal 100 gigs of RAM.


Just to be clear, I did not in any way mean this as disparaging toward Weird Al. I've been a fan of his music for decades.

But as I said elsewhere in this thread, I have been surprised in the past by people I would have described as fairly tech-savvy, who still called hard drive space "memory".

However, if the two phrases are related (as opposed to just being adjacent in the song), at this point I'd guess Al does probably know the difference, and the relationship is intended to be intentionally over-the-top absurd bragging.


> who still called hard drive space "memory"

I mean, “memory” in the computer sense is by analogy to memory in the brain sense, which is used for both short and long term storage (and I'd argue more commonly for long term storage). It therefore doesn't seem unreasonable to me for people to call a computer's long-term storage its “memory”. The function of a hard drive is to remember things.


He didn’t say “memory” - he said RAM! Why? Because it’s funnier and it rhymes with spam! Plus “hard drive space” is three syllables. It’s a song not a README, sheesh


It’s a SINGME not a README


I like the idea of adding SINGME files to my repos from now on


I consider myself pretty tech savvy and call hard drives memory. HDDs exist in the middle of a memory hierarchy stretching from registers to cloud storage.


You're not wrong, they are a kind of memory. I think more people would understand you if you call RAM memory and HDDs and SSDs storage, though.

It goes the other way too, RAM is a type of storage, which is the terminology IBM uses (and I think has used since the 60s or so).


Memory is where you store data


I don’t really care if the unwashed masses understand me.


Do you call them RAM, though?


Never thought about it before, but I guess SSDs are random access, so it would be technically correct to call them RAM, and that’s my favorite kind of correct.


Jumping on the nitpick train: SSDs are less random-access for writes because of the requirement to explicitly erase an entire block before you can rewrite it. RAM doesn't have this constraint. It has other constraints, like writing and refreshing rows at a time, but combined with the difference in erase cycles causing wear in flash, you do end up with a significant difference in just how randomly you can access these memories.
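
A toy sketch of that erase-before-rewrite rule (purely illustrative, not modeled on any real SSD's firmware): program operations can only clear bits, and only a whole-block erase can set them back, which is also the operation that accumulates wear.

    # Toy model of a NAND flash block (illustrative only):
    # programming can only clear bits (1 -> 0); getting any bit back to 1
    # requires erasing the whole block, and erases are what cause wear.
    class FlashBlock:
        def __init__(self, size=16):
            self.cells = [1] * size     # erased state: all ones
            self.erase_count = 0        # wear accumulates per erase cycle

        def program(self, index, bit):
            if bit == 1 and self.cells[index] == 0:
                raise ValueError("must erase the whole block to set a 0 back to 1")
            self.cells[index] &= bit    # 1 -> 0 allowed; 0 -> 1 is not

        def erase(self):
            self.cells = [1] * len(self.cells)
            self.erase_count += 1

    block = FlashBlock()
    block.program(3, 0)    # fine: clearing a bit
    block.erase()          # required before cell 3 could hold a 1 again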


SRAM is the only type of memory offering true random access. DRAM can suffer so much on pure random access that completely rewriting your algorithm to have mostly sequential access to 20x more memory still ends up 4x faster (lean miner vs mean miner in [1]).

[1] https://github.com/tromp/cuckoo


Well, as long as we are nitpicking: random access does not say anything about the block size - you're not going to be writing individual random bits with any kind of memory. And random access also does not say that all blocks have the same access time, only that the time does not depend on what you have accessed previously. In conclusion, an array of spinning rust drives is random access memory.


My mother calls the entire computer a "hard drive", a leftover from old newspapers in the '90s.


Yep. Most faculty/staff where I work either call a desktop computer a "hard drive" or a "CPU".


RAM is just cache for your drive. And these days your drive is just cache for the internet. It’s all memory, one big memory.


Most of the time, my RAM directly caches the internet. (Ie when I watch YouTube or browse the web, stuff doesn't typically go on disk.)


RAM is NOT cache for your drive. It's the active work space where the CPU tracks all the bits and bobs of executing code - state, variables, etc.

Swap space on disk is a cache of RAM, if you want to go down that path - but RAM and disk support entirely different purposes.


Or, swap (and hibernate) are the RAM cache flushing to disk. ;)


> I have been surprised in the past by people I would have described as fairly tech-savvy, who still called hard drive space "memory".

That's not really incorrect, though. RAM is random access memory, as opposed to memory that has other access patterns.

It's less common these days, but among people who learned to use computers in the 80s, you'll still hear some use "memory" to refer to storage that isn't RAM.


Windows 95/98 labelled HDD space as "Memory" so it's unsurprising that that's common.

That's why I always had to make sure to say "RAM" when I meant memory in the 90s.


>> Windows 95/98 labelled HDD space as "Memory" so it's unsurprising that that's common.

Where did it do that?


Calling a hard drive “memory” can also be left over from very old computers which blurred the distinction between transient and permanent/stable memory.


| basically 0% chance

lost credibility here


The rest of the lyrics make it clear that Weird Al knows exactly what he's talking about.

https://genius.com/Weird-al-yankovic-its-all-about-the-penti...


In this particular song, His Weirdness offers a few timeless observations:

> Upgrade my system at least twice a day

> What kinda chip you got in there, a Dorito?

> You say you've had your desktop for over a week?

> Throw that junk away, man, it's an antique!


Yeah, the Usenet reference had to have been obscure even back then. And beta-testing OSes back then would have been... what? Windows Neptune and Copland?


At the time, I had used/tried out the following OSes

  - BeOS 4.5 (including some beta versions)
  - BeOS 5 (including some beta versions)
  - Windows NT 4.0
  - OS/2 Warp
  - Windows 95 (and its service releases) (including beta versions)
  - Windows 98 (and its service releases) (including beta versions)
  - PC-DOS
  - MS-DOS with DosShell
  - MS-DOS with Windows 3.11
  - Whatever old version of MacOS was on the school computers
  - Slackware
(might have some dates wrong)

I was also on mIRC and downloading from newsgroups regularly.

I think many ISPs would give you guides about using email/newsgroups back then, as those services were considered required for an ISP. TUCOWS was super popular for this newfangled WinSOCK software (TUCOWS stands for The Ultimate Collection Of WinSOCK Software). I remember testing how fast the first consumer cable internet connections were by downloading from them.

You are right - for most people that stuff was probably obscure.


I had been exposed to some of those, plus SunOS, Solaris, VMS, AmigaOS, QNX, Red Hat and Debian.

There was no such thing as an OS monoculture.


At that time you'd have been mocked mercilessly about mIRC. It's a client program, and you'd be on IRCnet, EFnet or QuakeNet, and the last one would earn you endless disrespect from IRCnet veterans.


Today, Usenet is obscure. The late '90s were right around when Usenet was at its peak. ISPs back then advertised newsgroups as a feature: "we have a full newsfeed", that sort of thing. I worked at one that built a server with a whopping 9 gigabyte hard drive just for the NNTP feed! That was pretty cutting edge back in 1995.


Among the general MTV-watching audience of the 90’s, Usenet was very obscure.


I subscribed to Newzbin well into the mid-2010s.


Eh; Debian has been around since '93. I'm sure there were dozens of non-mainstream OSes fighting for nerds' attention by 1999.


Weird Al has an engineering degree from Cal Poly


To be specific, he has an architecture degree from Cal Poly.


Maybe so, but I don't see anything in the lyrics that could not be taken from a friend who's "really good with computers".

At this point, however, I'm willing to give Al the benefit of the doubt. It certainly would be "in character" (for lack of a better term), for him to be knowledgeable enough to know the difference. But I have been surprised by others on this particular point in the past.


> I will, however, leave room for him knowing the difference, but intentionally being absurd (it is Weird Al, after all).

This isn't even intentional absurdity. The theme of the song is bragging. Here are some other lyrics.

    I'm down with Bill Gates
    I call him "Money" for short
    I phone him up at home
    and I make him do my tech support

    Your laptop is a month old? Well that's great,
    if you could use a nice heavy paperweight

    Installed a T1 line in my house

    Upgrade my system at least twice a day

The line about having 100 gigabytes of RAM is completely in keeping with every other part of the song. There's no more reason to think Weird Al might not have known what RAM was than there is reason to believe he didn't know that having Bill Gates do your tech support was unrealistic, or that PCs are rarely upgraded more than once a day.


I think the main point is that all the other (technical) brags are quite quaintly outdated (T1 lines are glacial by today's standards, while, interestingly, the lines about how quickly hardware becomes obsolete have become less accurate over time); on the other hand, 100GB of RAM is still quite extreme.


This was an active topic of debate back in the days when people still relied on modems. T1 gives you 1.544 Mbps of "bandwidth", but the fact that it's "guaranteed" bandwidth means it should be fast enough for anything you'd need to do as an individual user. If you had your own private T1 line all to yourself, the latency should feel the same as being on the same LAN as whatever service you're trying to access. Even a 56k modem still had a small but noticeable latency, especially if you're doing command-line work where you expect echo-back on each character you type.

People don't really understand the speed vs. bandwidth debate, but they do know the psychological difference when latency is low enough to be unnoticeable.
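
For the throughput half of that debate, a back-of-envelope sketch (nominal line rates, ignoring protocol overhead; the per-keystroke latency point is a separate quantity that this says nothing about):

    # Time to move a 1 MB file at nominal line rate, ignoring overhead.
    T1_BPS = 1_544_000         # T1: 1.544 Mbps, dedicated
    MODEM_BPS = 56_000         # 56k modem, best case (rarely reached in practice)
    FILE_BITS = 1_000_000 * 8  # 1 MB

    print(f"T1:  {FILE_BITS / T1_BPS:6.1f} s")     # ~5.2 s
    print(f"56k: {FILE_BITS / MODEM_BPS:6.1f} s")  # ~142.9 s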


IIRC the point about T1 lines is their availability + uptime + bandwidth guarantees, but yeah, they're not so hot anymore given the ubiquity of other high-performance networking alternatives.


I don't know how extreme I'd consider it; 64GB of RAM is relatively common and 128GB is just one step up from that.


I'd say it differently: 64GB of RAM is easily purchasable and 128 isn't much harder - just $$$.

But to my knowledge not many purchase it. As the article says, the defaults for new machines from the main vendors are still 8 or 16. I suspect most could be happy with 32GB for the next 3 years, but that isn't a default.


64 GB is a high-end desktop these days. But most desktops aren't high-end, and most PCs these days aren't desktops.


> There's no more reason to think Weird Al might not have known what RAM was than there is reason to believe he didn't know that having Bill Gates do your tech support was unrealistic

I understand why you might argue that, but I've been surprised in the past by people who were fairly-well-versed on computers (no pun intended), but still called hard drive space "memory".

I should go listen to that section of the song again (like I said elsewhere, it isn't one of my favorites of his). By intentional absurdity, I meant that I could see Al intending this to be a case of bragging in an absurd way "Defraggin' my hard drive for thrills, I got me a hundred gigabytes of RAM".

But now that I noticed the "I" in the middle there (I had missed it before), I'm guessing I was wrong in thinking the lines could be cause-and-effect (which would be the source of the absurdity).


> but still called hard drive space "memory".

Well, it is secondary memory. Besides, absolutely, it is memory. It's just not the main memory in a relative context about your computer.

In fact, "storage" and "memory" mean about the same thing. And since it's not technically the main or primary memory of our computers anymore, and RAM is actually an implementation detail that all of the in-machine memory shares, I'm not sure I have a good pedantic name for it.


> RAM is actually an implementation detail that all of the in-machine memory shares

Nope. A pagefile is random-access memory backed by your sequential-access hard drive (or, hey, maybe your random-access SSD) instead of your "RAM".


I mean... My computer has a device called RAM that communicates with the north bridge by a protocol where sequential readings have lower latency than random ones. It also has a device called disk (or sometimes drive), that has nothing shaped like a disk (and is never loaded), and communicates with the north bridge by a protocol where sequential readings have lower latency than random ones.

At the CPU side, there is an in-package device that communicates with the memory controller by a protocol where sequential readings have lower latency than random ones. It communicates with an in-die device, that finally communicates with the CPU with a protocol that provides random access. That last one is obviously not called "RAM".


> I'm guessing I was wrong in thinking the lines could be cause-and-effect

This is the root cause of why everyone is arguing with you. For some reason, TFA quotes the defrag line and the RAM line as if they are somehow paired, which you've also latched onto, but it makes no sense to consider them as a pair. Many lines in this song are to be interpreted as their own complete thought. And if we really want to pair up consecutive lines, wouldn't we use lines that rhyme to do so?

...

Nine to five, chillin' at Hewlett Packard? // Workin' at a desk with a dumb little placard?

Yeah, payin' the bills with my mad programming skills // Defraggin' my hard drive for thrills

I got me a hundred gigabytes of RAM // I never feed trolls and I don't read spam

Installed a T1 line in my house // Always at my PC, double-clickin' on my mizouse

...


Yeah, that's fair.

As I said before, this is not one of my favorite Weird Al songs (many of which I could sing along with word-for-word from memory). I was never crazy about the song, so I am not surprised that I didn't know the surrounding context (which the original article left out). Still, I don't think you can exclude the possibility of a cause-and-effect relationship spanning the middle of two rhyming lines (perhaps particularly, in rap music), but after seeing and hearing the lines in context, I don't think that was Al's intention here.


It's really not the users' fault. Most civilians understand that computers have to have some way to "remember" their files, but the fact that computers also need memory that "forgets" if the power goes off makes no sense to them.

It shouldn't make sense to us either; it's a ridiculous kludge resulting from the fact that we've never figured out how to make memory fast, dense, cheap, and nonvolatile at the same time.

Actually Intel did figure it out with Optane. Then they killed it because computer designers couldn't figure out how to build computers without the multilevel memory kludge. IMHO this is the single dumbest thing that happened in computer science in the last ten years.


My understanding is that the problems with Optane were a lot more complicated than that. @bcantrill and others talked about this on an episode of their Oxide and Friends Twitter space a few weeks ago. A written summary would be nice.


Thanks for the tip. I learned a lot listening to this [0].

My takeaway: Optane was good technology that was killed by Intel's thoroughly wrong-headed marketing. It's cheaper than DRAM but more expensive than flash. And much faster than flash, so it makes the fastest SSDs. But those SSDs are too expensive for everybody except niche server farms who need the fastest possible storage. And that's not a big enough market to get the kinks worked out of a new physics technology and achieve economies of scale.

Intel thought they knew how to fix that: They sold Optane DIMMs to replace RAM. But they also refused to talk about how it worked, be truthful about the fact that it probably had a limited number of write cycles, or describe even one killer use case. So nobody wanted to trust it with mission-critical data.

Worst of all, Intel only made DIMMs for Intel processors that had the controller on the CPU die. ARM? AMD? RISC-V? No Optane DIMMs for you. This was the dealbreaker that made every designer say "Thanks Intel but I'm good with DRAM." As they said on the podcast, Intel wanted Optane to be both proprietary and ubiquitous, and those two things almost never go together. (Well obviously they do. See Apple for example. But the hosts were discussing fundamental enabling technology, not integrated systems.)

[0] https://podcasts.apple.com/us/podcast/rip-optane/id162593222...


> Most civilians understand that computers have to have some way to "remember" their files, but the fact that computers also need memory that "forgets" if the power goes off makes no sense to them.

Well, of course that makes no sense. It isn't true.

We use volatile memory because we do need low latency, and volatile memory is a cheap way to accomplish that. But the forgetting isn't a feature that we would miss if it went away. It's an antifeature that we work around because volatile memory is cheap.


I agree we would be fine if all memory was nonvolatile, as long as all the other properties like latency were preserved.

In terms of software robustness though, starting from scratch occasionally is a useful thing. Sure, ideally all our software would be correct and would have no way of getting into strange, broken or semi-broken states. In practice, I doubt we'll ever get there. Heck, even biology does the same thing: the birth of a child is basically a reboot that throws away the parent's accumulated semi-broken state.

We have built systems in software that try to be fully persistent, but they never caught on. I believe that's for a good reason.


It would take serious software changes before that became a benefit. If every unoptimized Electron app (but I repeat myself) were writing its memory leaks straight to permanent storage my computer would never work again.


> If every unoptimized Electron app (but I repeat myself) were writing its memory leaks straight to permanent storage my computer would never work again.

This is a catastrophic misunderstanding. I have no idea how you think a computer works, but if memory leaks are written to permanent storage, that will have no effect on anything. The difference between volatile and non-volatile memory is in whether the data is lost when the system loses power.

A memory leak has nothing at all to do with that question. A leak occurs when software at some level believes that it has released memory while software at another level believes that that memory is still allocated to the first software. If your Electron app leaks a bunch of memory, that memory will be reclaimed (or more literally, "recognized as unused") when the app is closed. That's true regardless of whether the memory in question could persist through a power outage. Leaks don't stay leaked because they touch the hard drive -- touching the hard drive is something memory leaks have done forever! They stay leaked because the software leaking them stays alive.
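
A minimal sketch of that kind of leak (hypothetical names, just for illustration): the caller believes it is done with each record, but a forgotten module-level list still references them, so the runtime never reuses that memory; the OS still reclaims all of it the moment the process exits, volatile RAM or not.

    # Hypothetical sketch: a leak caused by a lingering reference, not by
    # where the bytes physically live. Memory use grows for the life of the
    # process and is reclaimed in full when the process exits.
    _seen = []   # forgotten debugging aid that keeps every record alive

    def handle_request(payload):
        record = {"payload": payload, "status": "done"}
        _seen.append(record)   # the leak: caller thinks it is done with record
        return record["status"]

    for i in range(100_000):
        handle_request(f"request {i}")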


> The difference between volatile and non-volatile memory is in whether the data is lost when the system loses power.

I'm aware. This is a feature for me - I disable suspend/hibernate/resume functionality. I don't want hiberfile.sys taking up space (irrelevant in this scenario, I guess) and I certainly don't want programs to reopen themselves after a restart, especially if it was a crash. If all storage were nonvolatile, OSes would behave as though resuming from hibernate (S4) all the time.

> that memory will be reclaimed [. . .] when the app is closed.

Again, I'm aware. I'm glad you've never had any sort of crash or freeze that would prevent closing a program, but it does happen.

OSes would need to implement a sort of virtual cold boot to clear the right areas of memory, even after a BSOD or kernel panic. Probably wouldn't be that hard, but it would have to happen.


> Again, I'm aware.

Are you? Because according to your words in this more recent comment, your original comment was meaningless gibberish.

> If all storage were nonvolatile, OSes would behave as though resuming from hibernate (S4) all the time.

That isn't even possible to do. Thus, obviously, it would not be done.


> That isn't even possible to do.

What? Of course it has to have a first boot sometime, but past that S5 would no longer need to exist.


You could still have a restart "from scratch" feature in the OS. But persistent RAM could potentially mean the power dropping for a few seconds means you don't lose your session.


I used to explain it to clients as the difference between having your files nicely organized in cabinets or on shelves, and having the contents of several different files strewn over the desk. For people who like to keep a tidy desk this metaphor made immediate sense.


It explains some aspects and obscures some other ones.

It explains how the documents at arm's reach on the desk are faster to access than the things in the cabinets. It obscures the fact that stuff on the desk disappears if the power supply glitches even just a little.

In fact we invented these batteries with electronics which sense if the electricity is about to go out, so the computer has time to carry the documents from the desk to the cabinets before they disappear. And we think this is normal and even necessary in pro settings. (I'm talking about uninterruptible power supplies, of course.)


A lot of people clear their desk before leaving the office, either because they're tidy and well organized or because they don't want to leave confidential info out where the cleaning staff could browse through it. It's not hard to extend the metaphor to a reboot of the computer being like the end of the workday.


on the desk/in the cabinets has worked as a metaphor for anyone I've had to explain it to.


See also "Wi-Fi" meaning any Internet access now. Or "screen saver" meaning the desktop wallpaper.


Just three days ago, professional streamer Hakos Baelz made this exact mistake.

...and then had to be told that the cable you plug into the back of a landline telephone isn't just a power connector. Sadly, I doubt we'll ever know how she thought telephones actually worked.


20mA current loop?


I think she was playing up most of her zoomerisms during that stream to play off Vesper's boomer character.

I think. But who knows.


The CPU is that box that the monitor and keyboard connect to. :-/


I could be wrong, but I think back in the day the computer monitor was the whole system for debugging and viewing the state of a running computer (which is why the debugger on an Apple II is called a monitor, for example) but now we just use it to mean the physical screen.


I'm not sure about that.

CRT monitors were a relatively late addition to terminal devices. Traditional terminals were hardcopy printers; they printed the output you received onto paper.


Internet is down! Umm no it’s fine. Shows me Facebook not working. Umm that’s just Facebook down. So the internet is down!

After a few rounds sure. The entire Internet is down. I’m going to go play an online game.


A pet peeve I've had to grow out of is the missing word "connection" when people say "The Internet is down".


So you don’t believe in technological solipsism?


The web server is down!

- Sales guy


I think you mean "the website is down!"

https://youtu.be/uRGljemfwUE


Huh? I have never encountered those.


My favorite one was somewhere on Reddit where someone bought a house and was asking how to rip out the Ethernet so they could install Wi-Fi.

Thankfully everyone was like "NO. Don't do that, here's how to install a wireless router to your ethernet setup"


I have. Often. It's probably not common in tech circles but "desktop", "wallpaper", and "screen saver" are often used interchangeably.

"Menu bar", "dock", "toolbar", "menu", and other similar terms are used more or less at random.

It's simply not common for the average user to know the names of UI components.


I was about to say that confusing the first two makes perfect sense because it's been a long time since the Desktop has had any real purpose beyond displaying a background image. Then I realised that the kind of people we're talking about probably have a shortcut for every single application they've ever installed on their desktop and have no idea how to delete them.


It's been an equally long period since screensavers had any actual purpose, being replaced by power saving features and screens without burn-in.


They're still somewhat useful with OLEDs, although even there turning the screen off is sometimes better.


It doesn't help that different brands of OS used different terms for the same thing.

The screen saver mixup makes no sense though. That can only mean one thing.


I'm clear on the first set but will cop to not having thought much about which of those is which in the second set.


I've seen the ones on the second set meaning completely different things from one application or environment to another. Even switching the meaning between themselves.


Your family must be very different than mine! (and coworkers too:)

I have frequently heard the phrase "you plug into here to get wifi" at work. My family and in-laws absolutely positively do not understand the difference between wifi and internet. In particular, the idea that they may be connected to wifi but not have internet is an alien, incomprehensible notion.

It depends who you hang around with. Computer professionals understand the difference. Most others do not, as they don't need to - empirically, to the majority of the population, wifi IS the internet (as opposed to cell phones, of course, which use "data":)


WiFi is the blue ‘e’ on my screensaver.


Aaaugh. You just summarized all the frustration of trying to help my family with technology.


My children complain the wifi is down when their ethernet cable is broken. They say that AFTER THEY TELL ME IT'S BROKEN. This is not just a meme, they should know better, and are very unhappy on WiFi, but still tend to call all internet Wi-Fi.


There was an incident years ago with a library offering wired Ethernet for laptops, and the sign said "wired wifi connection". I'm not sure it's really that common.


wifi but the wi stands for wired


And what does the "fi" stand for?


From: https://en.wikipedia.org/wiki/Wi-Fi

> The name Wi-Fi, commercially used at least as early as August 1999, was coined by the brand-consulting firm Interbrand. The Wi-Fi Alliance had hired Interbrand to create a name that was "a little catchier than 'IEEE 802.11b Direct Sequence'."

> The Wi-Fi Alliance used the advertising slogan "The Standard for Wireless Fidelity" for a short time after the brand name was created, and the Wi-Fi Alliance was also called the "Wireless Fidelity Alliance Inc" in some publications.

Elsewhere I've seen that it was chosen because of its similarity to HiFi - which is short for "High Fidelity" (regarding audio equipment):

https://en.wikipedia.org/wiki/High_fidelity

> High fidelity (often shortened to Hi-Fi or HiFi) is the high-quality reproduction of sound. It is important to audiophiles and home audio enthusiasts. Ideally, high-fidelity equipment has inaudible noise and distortion, and a flat (neutral, uncolored) frequency response within the human hearing range.

However, there seems to be a lot of debate as to whether the 'Fi' in WiFi was ever fully accepted to mean "Fidelity". So pretty much it seems marketing people liked the way "WiFi" sounded, so they used it.


WIres For Information


The same thing it stands for in "WiFi."


I hear stuff like "Comcast offers the fastest wifi for your gaming needs" anytime I am accidentally exposed to ads.


Assuming most people use the combo modem/router (WAP, technically, since this entire thread is an exercise in pedantry) provided by their ISP, this makes sense from a non-technical user's perspective, if Comcast always ships hardware with the latest Wifi spec. Of course, you need a whole mess of asterisks after that, because I don't think I own a single client device with Wifi 6E, plus network contention, time sharing, etc. Gamers should do what I do and plug a wire thingy from their computer to their Wifi thingy.


Ok, I found a hanger wire in my closet and have plugged one end to my computer power hole and the other into my wifi thingy power hole. Now what?

- someone in a much too near future


I've certainly heard both.

For a lot of people all their connected devices at home are wireless: smart devices, phones, tablets, … (many have laptops rather than desktops), and while out they connect to other wi-fi networks. It is easy to conflate WiFi and cellular data access, and if you don't use much or any wired networking at home, all your normal network access is therefore WiFi.

Screensaver as wallpaper is more rare but I have heard it, and have done for some time (I know non-technical people who have used wallpaper and screen saver wrong for many years, going back to when fancy screen savers were more common than simply powering off, either calling both by one name or just using them randomly). More common these days though is people simply not knowing the term, except to confuse it with a physical screen protector.


Those lines are separate lines of separate couplets.

The lines around it are

"Paying the bills with my mad programming skills

Defragging my hard drive for thrills

Got me a hundred gigabytes of RAM

I never feed trolls and I don't read spam"

They're not really related to each other besides being right next to each other. The lines rhyme with other lines. Which, if you were trying to link ideas lyrically, is where you'd do it, on the rhyme.

But, the entire song is just a litany of various brags centered around technology. It is, like most of Al's work, pretty clever and knowledgeable. Not only of the source material, but of the subject presented.


Yeah, I guess it depends on whether you interpret them as separate.

It is possible to think of it in a cause-and-effect way "Defragging my hard drive for thrills got me a hundred gigabytes of RAM". Which, honestly, I could see Al saying that as a purposefully absurd statement (because the first could not cause the second, but people brag like that all of the time).

I will admit that, although it seems like it should be (given my profession), this is not one of my favorite Weird Al songs --and I've been a fan for decades. So while I have heard it many times, I can't remember the last time I listened to it.


There's really only one interpretation. There's a definite pause between the lines as well. This isn't really a matter of perspective. The only way I can concede that it's possible to read those lines in your way is not particularly flattering to you.


I still tell people that I'm double clickin' on my mizouse.


I'm 100% sure he knew what he was talking about. That line wouldn't have fit the song if he threw out a reasonable-sounding number for the 90s.

> Defragging my hard drive occasionally, when I start to notice drops in i/o performance

> I got me 32 megabytes of RAM

No thank you.


I don't think I've ever heard anyone confuse ram and mass storage by saying "ram".

All I think I've ever seen is anyone who says "ram" knows what ram is, and people who are fuzzy on it call it all "memory".


Yeah, I think that's mostly true.

However, I'm sure I have heard a few people use RAM when they meant hard drive space. Though, in fairness, those were probably friends/relatives who used computers but didn't know a lot about how they actually work.


Another plausible explanation is that `100 Gigabytes of RAM` is more catchy than `100 Megabytes of RAM`


"A thousand megabytes of RAM" has the same meter, and sounds more like an absurd brag than "A hundred gigabytes" even though it's smaller


Conversely, when it comes to mobile devices, RAM is still used in its original meaning, but somehow internal Flash storage became "ROM".


That one is logical, kinda, flash memory is [EEP]ROM.


Not once you drop the first part and it becomes just "read-only memory", though!


I think the poster is overthinking it.

It was probably just that a gigabyte sounds big, 100 sounds big. Therefore, 100 gigs


> especially likely, because he's talking about *defragging a hard drive*

Can someone explain why this is incorrect? I thought a hard drive is typically what one would defrag


The thing people are discussing is the 100GiB of RAM line. You wouldn’t typically defrag RAM (although that was a thing back then) and 100GiB is more indicative of HD space and not RAM. So the question is whether the two lines are related and did Weird Al say that defragging his disk freed up 100GiB of RAM (an impossible amount at the time) or 100 GiB of storage and got RAM and disk confused.


> and 100GiB is more indicative of HD space and not RAM

That's exactly what clearly points to it being about RAM.

> So the question is whether the two lines are related

There's no reason to think that they're related.


Or the fact that he's "doing it for thrills" implies he doesn't need to because he has mounted his entire file system as a RAM disk.


What should he have been talking about defragging?

I don’t remember ever defragging anything but my hard drive.


> the other plausible explanation

It's not really plausible in the context of those lyrics.



