UI is a function of your organization (jim-nielsen.com)
173 points by damethos 9 months ago | 113 comments



Sometimes, UIs are designed with unrealistic expectations and mismatches with how processes actually work. This is maybe lost on today’s “digital native” generation, but anyone who worked at organizations a while back should understand it intuitively.

Back then, processes inside companies were paper-driven, a variation of “produce some kind of document, pass it along to another department, get a stamped copy/receipt to prove it’s been done”. I always use this example when designing architectures and UIs: if you couldn’t design the same process as paper being passed around, the design is missing something. You need to really grok the company structure and the domain to design something sensible.


Yes! And, the killer feature of paper that no digital UI has fully captured is the margin.

If your business process doesn't have a form field for some data, but the person on the ground understands that it's valuable, it's naturally scribbled onto the margins, and then worked into the next version of the form.

If you're using nearly any digital UI, the feedback loop (if it exists at all) is a side channel.

More businesses should just use paper.


> And, the killer feature of paper that no digital UI has fully captured is the margin.

This is THE reason why, in the year of our LORD 2024, physicians still HATE electronic medical records. Margins, sticky notes, and red ink you can circle anywhere you d~~n well please, not organized data easily parsed during discovery.


With the recent discussion on here about using a massive single text file as a scheduling and planning tool and todo list, and some developers who still carry around a paper notebook to capture notes and ideas, you would think this desire by users of our applications would be better understood.


> recent discussion on here about using a massive single text file as a scheduling and planning tool.

I can't find the discussion. Can you share the link?


Here's the article: https://jeffhuang.com/productivity_text_file/ It's been submitted a few times for robust discussion, most recently a few days ago.


It's not 1:1, but from observing some math professors, tablets with a stylus are a partial solution to this.


Whether or not it is literally hand written is not important.

The thing that matters is that it is unstructured.


If only hospital systems didn’t force their physicians to use structured data. Administration loves structured data because it is an easy road to hoodwinking a jury into calling it professional negligence on the part of a physician rather than plain bad luck on the patient’s part. It makes it easy for the losers who love to point out “see, it’s all right there in the medical record.”


You can support a structured way to add unstructured data. This can be useful as a stopgap to understand where the structured data is inadequate, but it can become a hindrance if they don't use the structured data at all.
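
A minimal sketch of the stopgap, with made-up field names: keep the structured fields, give every record a free-text margin, then periodically mine the margins to see which structured fields are missing.

  from collections import Counter
  from dataclasses import dataclass, field

  @dataclass
  class IntakeForm:
      # Structured fields the process already anticipates.
      patient_name: str
      visit_date: str
      # The "margin": free text for anything the form didn't anticipate.
      margin_notes: list[str] = field(default_factory=list)

  def review_margins(forms: list[IntakeForm]) -> list[tuple[str, int]]:
      # Surface recurring margin notes so they can be promoted into
      # structured fields in the next revision of the form.
      counts = Counter(note for f in forms for note in f.margin_notes)
      return counts.most_common(10)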


This is precisely what MediaWiki tries to do, and it's why I love this software more than anything. You have templates and forms that the user can fill out, but the page is really a piece of paper - outside of the template you can write anything you want at all. With Cargo or Semantic MediaWiki you can even make a database out of the structured parts, and any record always includes the page from which that piece of data originated, so you can link to it to see the full context.

MediaWiki won't scale to support billions of records in a database, but for small to medium sized things imo it's perfect.


> More businesses should just use paper.

… or rather, digital processes shouldn’t throw away the benefits of paper.

E.g.: document databases instead of fixed-schema tabular (paper can accept more data than was anticipated), append-only databases instead of update-in-place (paper doesn’t forget). I’m a big fan of Datomic and similar databases because of that.
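
A hand-rolled sketch of the append-only idea (not Datomic's actual API): instead of updating a row in place, you append a timestamped fact and derive the current value by replay, so nothing is ever forgotten.

  import time

  facts = []  # append-only log of (timestamp, entity, attribute, value)

  def assert_fact(entity, attribute, value):
      facts.append((time.time(), entity, attribute, value))

  def current_value(entity, attribute):
      # Latest assertion wins; older values stay queryable forever.
      matches = [f for f in facts if f[1] == entity and f[2] == attribute]
      return matches[-1][3] if matches else None

  assert_fact("order-1", "status", "received")
  assert_fact("order-1", "status", "in-oven")
  current_value("order-1", "status")  # "in-oven"; "received" remains in the log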


The problem is when business people want statistics and insights. Unstructured data allows skipping some fields, but that data might be required for analysis.

An agent taking free-form notes might skip parts that are required for purposes other than the job at hand; the agent optimizes for the work he thinks is important, and general statistics don't concern him.

The problem is that the user on the ground sometimes has to be forced into the scheme of the bigger picture of the processes.

On the other hand, too much data is also marked as required by the business, where an ad hoc note would do.

I see this with our customers: they ask us to implement a bunch of fields as “must be filled in”, then start complaining that it takes too long to fill them all in. I didn’t make them mandatory on my own; it was their business analysis.


And then there are people demanding free-text fields be removed from various forms/B2B/market systems on the basis that they are rarely used… yet those fields easily contain the most valuable information in the system.

The world is highly cursed. The information system which accommodates this is highly blessed.


Why can’t this be done with a textarea for notes? Any chat system or ticket system has this.


Discoverability. I can't count the number of times I've had to ask phone support people to please read the notes that the last agent took. If you have a paper form, someone marking in the margins is immediately apparent.

Hypermedia didn't go far enough, in the sense that it overlooks how people form relationships with specific documents. In the case of margin notes, individuals take an instance of a document template and mark it up to customize it. They recognize that the shape of the form is not the only shape the document can take, and they know the form is an extension of their authority. So if it doesn't make sense, they change it.

You can't really reproduce that experience with hypertext because there's no concept of "this document doesn't apply to this situation let me, the individual user, change it to communicate something unique." And because humans aren't reviewing any of these documents, there's nobody there to interpret it correctly even if you could. Essentially the Internet and web form design and replacement of humans with machines has involved a great abdication of authority from the front-line employees who process these forms. I would argue that it's becoming the downfall of our modern society. If a programmer or analyst couldn't envision your situation, nobody works at the company who can do anything about it.


I work for a company that has mixed paper/electronic documentation, and I can’t overstate how useful it is to highlight, circle, and/or cross out sections of a form. Also to see /what/ was crossed out, to know the history of the document.

You might say, “well we can add these formatting features and make the form elements removable and…”. But if you know you know. Asking “normal people” to do all this in markup is an insta-fail. Most of us are caterers, not programmers.


AI can handle the mixed form... Perhaps now with LLMs we can just scan all the stuff with crossed-out words and circled conditions and have the AI make it searchable.

Then again we should always ask ourselves if a drug dealer would ever recommend someone stop doing drugs.


LLMs certainly can cope with disordered notes scribbled into the metaphorical margins, and OCR can usually turn literal scribbles in literal margins into something approximating what was written, buuuuuuuuut the AI we currently have is just good enough to be dangerous if you tried using it this way.

On the other hand, the last 15 months have been "if you don't like the state of the art AI, wait a week", so I may be out of date with that assessment.


If PDF editing tools were a little better and the average users' file name habits weren't ghoulish we could have digital paper.


Yeah, I too wanted to have paper that runs its own JS interpreter. (Yes, PDFs can embed executable code, because why not)


What naming habits would you like to see the average user adopt?


If I never see people prepending and appending seemingly random versions again, I'd be thrilled.

draft_document.pdf, document.pdf, final_document.pdf, final_final_document.pdf, document_final.pdf, document_v2.pdf, final_final_document_v3.pdf

Which one do I use?


I think this is in big part because we got save functionality wrong. The interaction you typically need is to save a snapshot into a different file, and continue to work on the current one. That is, the operation:

  File -> Save As "file-snapshot.doc"
  (you're now editing file-snapshot.doc)
  File -> Open "file.doc"
Should've been a single operation:

  File -> Save current as copy "file-snapshot.doc"
  (you continue to edit file.doc)
The software could even prepend the current date/time to the snapshot name by default, or something. Then, the simplest way of working would always keep "file.doc" as the most recent version, removing the need for the "_v2_final_really_final_final_final" suffixes.
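
A sketch of that single operation (function name made up): write a dated snapshot next to the file and keep editing the original, so file.doc always stays the latest version.

  import shutil
  from datetime import datetime
  from pathlib import Path

  def save_copy_as_snapshot(current: Path) -> Path:
      # Copy the current file to a timestamped sibling; the editor
      # keeps the original open, so no "_v2_final_final" suffixes.
      stamp = datetime.now().strftime("%Y-%m-%d_%H%M")
      snapshot = current.with_name(f"{stamp}_{current.name}")
      shutil.copy2(current, snapshot)
      return snapshot

  # save_copy_as_snapshot(Path("file.doc"))  ->  2024-05-01_1230_file.doc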


While there is software with similar functionality, it is not universal enough for people to expect it to be there and make using it a habit.


This is why naively surfacing file systems to users is such a mistake. My gut feeling says that documents should always contain all of their revisions — with a time slider, and marginalia always enabled. However, organization of the documents seems like a critical issue. I can tell you, from my wife's/kiddos desktops, that organization means "one big pile". I strongly suspect that "one big pile" is the right answer, no matter how much it pains my programmer's heart.


Is it any different than real life?

Some people organize their entire lives into boxes and containers and labelled drawers. Some people have, basically, a big pile that they just shuffle through when they need something.

For example, I prefer a backpack that's one or two large compartments. Yet they sell backpacks with dozens of little slots. That's too much for me, but somebody thought people wanted it, so somebody must. Over organization doesn't work for me.


The old Word document format (*.doc) could, by accident, contain older versions of the document, as a side effect of how editing operations worked on the internal data structures (not purging all deleted content right away, to optimize editing speed, or something along those lines). This made for some embarrassing cases when third parties received documents with hidden content they weren’t supposed to know about.

And we all know of the numerous cases where text was blacked out in PDFs by adding black rectangles on top of it, but the text was still contained in the document.


The reason one big pile is the answer, at least for me, is that to do otherwise would be to bog myself down in minutiae of taxonomy. Being hierarchical worked when I was still in school (and somewhat works for work), but one large bucket with the ability to search by content—think iPhoto letting you search for text in photos—is how I run things now.


yyyy-mm-dd ReferenceIfAny Project NameOfDocument v1.pdf


So for a flat structure where filenames are sorted by time of creation?


I often think about my partner's pregnancy journey, when she was pregnant she was given a folder which had all sorts of forms in. For every appointment she went to, she'd bring the folder in and the staff would read the existing pages, and add their own (results of scans etc...)

Due to some minor complications, her book ended up being quite full, and we went to lots of places, some hospitals, some GPs (which are private companies), and they could all read and add to this book.

In the UK, you can rock up to any hospital with maternity facilities when you're in labour, and have your baby there. You bring your folder with you, and they can read the notes from previous doctors. Once the baby is born, you hand the notes in, and they are collated together and put on your NHS record.

I often think about making a digital version of that. How difficult it'd be to build something that worked across all the systems that different healthcare providers use, is reliable, and secure.

Sometimes, analog is best.


If it doesn't work with sticky notes, it doesn't work.

The rest is just scale.


> Back then, processes inside companies were paper-driven

Even today, the analogy I give for explaining APIs and integration sometimes involves tax-filing paperwork. The API for this system is that box 2 contains your full name, but the API for that system is a form where your last name in box 5A and your first name is in box 5B...
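
In code, the integration work is just the adapter between two arbitrary shapes (box numbers from the example above; the splitting logic is my own naive guess):

  def system_a_to_b(record_a: dict) -> dict:
      # System A's "API": box 2 holds the full name.
      # System B's "API": last name in box 5A, first name in box 5B.
      first, _, last = record_a["box_2"].partition(" ")
      return {"box_5A": last, "box_5B": first}

  system_a_to_b({"box_2": "Ada Lovelace"})
  # -> {"box_5A": "Lovelace", "box_5B": "Ada"}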


This is a variation of "you ship your org chart."

After hearing the outbound version of this truism, I discovered that the inbound version is also true; ie "you buy your org chart."

Anyone selling to very large organizations, corporate or government, will know what I mean. Bloated and dysfunctional orgs with eternal sales cycles buy from bloated and dysfunctional orgs that can survive eternal sales cycles, and which are willing to sell bloated and dysfunctional products to satisfy arbitrary criteria.

This is one of many reasons why startups have a hard time selling to very large orgs.


Conway’s Law from 1967.

[O]rganizations which design systems (in the broad sense used here) are constrained to produce designs which are copies of the communication structures of these organizations.

— Melvin E. Conway, How Do Committees Invent?


I've never really agreed with this assertion. What gives it the weight of being an axiom?


Communication flows create dependency points that limit what the design can be.

Consider an organization that designs broom handles. They have an industrial design department that selects the shape and material, and a market research department that selects the color and packaging to fit current trends. If the market research is purely downstream from the industrial design, their feedback can never affect the tactile properties of the product, thus limiting the range of possible designs.

Replace the physical product with an abstract system, and the same design dependency issues apply.


I would say it's very hard if not impossible to find a counterexample.


I have certainly worked on software that didn’t follow it (ownership of one thing was spread out across different teams). But it was certainly a huge pain to get anything done. So I think the reasons we typically follow Conway's law are good ones.


Conway's basic argument is pretty simple:

1. For any two software systems to interact, they must use some sort of agreed API or protocol.

2. In order to have that agreed API or protocol, the teams building those two systems must communicate in some way (even if it's just one-way communication where one team documents an API and the other consumes it).

3. Therefore, the communication structures of the software system will reflect the communication structures of the organization. So the software architecture mirrors the org chart.


I used to think Apple was the king of UIs.

They have long departed from quality and functionality. Jobs had a no-nonsense approach where form followed function: if it functioned properly, the form was intuitive. Apple UIs from the Jobs years were incredible. Nowadays, under a forced yearly redesign policy, things have reversed so far that they are just hidden and awkward.

Kinda funny: you force it and it sucks.


> I used to think Apple was the king of UIs.

Heh. There was a time when you had to upgrade your OS by ... first going into iTunes. Maybe this is still the case.

https://support.apple.com/en-us/108964


You're talking about iOS. We can only judge these things in context. What was the first Android version where everyone could receive updates for the OS over-the-air, not going through a computer + USB port? And when did Apple support the same for iOS?


The best example of this is the macOS document proxies. They were a sensible representation of an icon you could drag and drop; now they are hidden behind a hover on the title that stupidly animates them out.

Anybody who would make a critical productivity feature a hidden hover should be canned from a UX team. This choice was defended by Alan Dye.


Sorry, what are document proxies?

EDIT: Oh, is this it? [0]. I can't say I ever noticed that before (when it still existed)

0: https://osxdaily.com/2014/08/20/open-files-new-app-proxy-ico...


This is a feature I miss from macOS - perhaps the feature I miss most from macOS. Beyond drag-and-dropping, you could also right-click (or something similar, I don't remember; I used macOS years ago) the icon and you'd get a popup menu with the directory hierarchy under which the file was stored - clicking any of the menu options would open that directory in Finder.

This is the sort of integrated functionality you get when the OS, the application framework and the core applications that come with the OS are all written with each other in mind.


Maybe I am misunderstanding, but I just tried that flow with a Pages document and it still works in Sonoma. Drag the icon from the header, drop into an email. Now I'm curious what the old feature was.


What's changed is that newer versions of the OS hide the file proxy icon by default. You have to hover the cursor over the title for a second to see it, or to interact with it. Before, it was always visible, and ready to be clicked + dragged.


I think there’s an option somewhere to enable it, but yeah annoying it’s not default.


Ah maybe I changed this setting at some point and don't remember. It is always there for me, not only on hover.


Likewise, I think I enabled an option to always show it.

It's under System Settings → Accessibility → Display → Show window title icons


See: System Settings

With System Preferences I could pretty intuitively find anything I was looking for. The newer System Settings is basically unusable outside of search.


System preferences was a really poor design tbh. It just felt so easy because it had been there so long and everyone had the muscle memory.

The new one is not much better I agree. But the old one was objectively not great in terms of UX Design.

The main problem with the new one, in my opinion, is that they're shoehorning a mobile experience onto the Mac. Something Apple does more and more (and not just Apple; GNOME does it too).


The old System Preferences showed large, easily differentiable icons, front and center in a nice big grid. The new Settings app shows tiny, barely differentiable icons, hugging the left-most edge of the app. With the old System Preferences, I could immediately tell you where the "Display" and "Privacy & Security" icons could be found. In the new Settings app, I challenge you to tell me where they are in the list without looking.


>The main problem with the new one in my opinion is that they're shoehorning a mobile experience on the Mac. Something Apple does more and more (and not just Apple, gnome does it too)

A cost effectiveness measure over a well-thought out one, methinks


Does this include the use of cmd+spacebar? Personally, I feel like I haven't had to open it outside of my search.


My work laptop recently started having issues with the keyboard so I got a loaner, an unopened M1 Pro. It was still on an older version of the Mac OS, and I am so bummed that they want it back.

The UI that you're referring to is malpractice.


Honestly, I'm not even sure that this is a problem that can be solved. There are too many settings you could access, with no obvious single way to categorize them all. Search is probably the best you can do.


> I used to think Apple was the king of UIs.

The way I see it is that Apple excels at designing and building UI patterns and systems (I don't think anyone comes close), but they're terrible when it comes to the UI of actual software products.


For UI patterns and systems, do you have a post-Jobs example?

I'm thinking the word in that last sentence should be “excelled” (past tense).


I have one - The iPhone X's swipe navigation is brilliant. There is a little WebOS in there in multitasking gestures left and right, but it is such a straightforward system that works.


When we say “UI patterns or systems” we mean frameworks that provide high-quality defaults for all the software written for that system. Card/swipe navigation is a specific affordance of iOS — like Alt+Tab switching on desktop OSs — not a pattern or system. Further, as you mention, it was invented by Palm and copied by Apple eight years later.


So who's going to tell us that the Domino's Pizza Tracker is just a timer?

I don't eat there often, but when I have, the pizza sits in QA for about 10 minutes, then is "out for delivery" for about 15, despite the fact that I could literally walk to the place in 15 minutes, and they sure as heck aren't walking my pizza to me.

Then there's the fact that they outright lie by saying it has been delivered, then turning up about 10 minutes later.

Between the fact it's not the best pizza and this dodgy behaviour, they pretty much make sure I don't eat from there more often than about quarterly.


It's a failure of incentives for accurate tracking.

It's the same reason that when you buy from McDonald's, the order shows on the screen as "ready for collection" then "collected", then disappears from the screen well before the food has hit the counter at all.

The staff are incentivised to just press everything through the system asap regardless of the actual status of the food. They're presumably performance-measured against targets and aren't punished for just checking that everything went through quickly regardless of the reality.

Domino's are particularly bad (or noticeably bad) for it.

It's an important lesson about remote top down control and a failure mode of JIT systems. I've long wanted to prepare a proper blog post about this exact phenomenon using Domino's and McDonald's as examples, but I haven't put in the effort to collect the right evidence to fully understand the negative effects of IT systems misrepresenting reality.


If you've ever been told to wait in the parking lot at a fast food drive through, it's because the store has a metric on dwell time and throughput that they're optimising for.


I've had to do this with absolutely no one else behind me and no one in front of me.

It's a funny feeling, watching the machine at work


Yes this is really dumb at McDonald's. Very confusing. I'm surprised nobody checks for this practice.

But in my area they now do table delivery so I tend to use that.

Ps: it's really funny to see McDonald's proudly advertising table service as if it's some cool new thing they invented :)


I think eventually they should.

It will cause a real problem if, for example, operations engineers try to make optimisations based on the data.

For a toy example, you could imagine someone might analyze the wait times and determine that burger frying is on the critical path. Kitchens could be instructed to add an additional person to frying burgers while reducing the headcount at the drinks station. (You can probably tell I haven't worked in a McDonald's kitchen, but bear with the example.)

However, there is a hidden reality, given that the data was complete fiction: it could be that the drinks were actually on the critical path, and this intervention, which a computer model predicts will boost throughput, will actually harm it.

Of course, the reality of operations research will be a lot more nuanced and subtle than that, but the conclusion stands: fake data will lead to expensive, incorrect interventions and suboptimal optimisations.

There is also reputation to consider. If McDonald's gets a reputation as somewhere people avoid because they are put off by the broken system, then they should be proactively working out why.

It may well already show up in satisfaction surveys that people are put off by "the computer system", but it may be misattributed to the ordering UX rather than the complete package that includes the broken tracking system, fundamentally broken by misaligned incentives.


FYI the drink station is automated. When a person places an order and it has a soda, the soda machine actually has an automatic chute that drops the correct size cup into a rotating set of cupholders and dispenses the appropriate soda so the humans can be focused on preparing the rest of the order. When it comes time to finalize the order the soda is already in the cup ready to go, just add a lid and include it as part of the order.

[1]:https://www.youtube.com/watch?v=akv4vSXa5a4


Here in Europe they don't even add the lid anymore if you dine in. Nor do they provide a straw for diners-in (but you can still request one - I always do because the ice hurts my teeth).


Are you referring to those new fancy French reusable containers created due to new regulations? People were going crazy over them on TikTok because of how cool they looked. What do you think of them?


If you go observe the restaurant (or work there for some time) you'll notice that these failures generally occur during crunch time, when the screens that show what is in each order become overfilled with too many orders. Nevertheless, if you are present you can observe the build process of your food and see that it is generally within the correct time window.

>The staff are incentivised to just press everything through the system asap regardless of the actual status of the food. They're presumably performance-measured against targets and aren't punished for just checking that everything went through quickly regardless of the reality.

But at the end they are still required to deliver the food, or else there will be worse consequences. It is just not realistic to expect perfection in a noisy system where you can't exactly predict how many people will come during a busy hour. McDonald's is trying to improve with their massive data-collection operations. Things such as reading your license plate in the drive-through are not just a profit-making scheme: they feed analytics to predict what you may order, with the goal of improving wait times during busy hours.


Yup. I pick up my orders from Domino's quite regularly, and the tracker is reliable. The only thing that does happen is that they sometimes mark it "Ready for pickup" while someone still needs to take the pizza from the oven and put it into a box. So I sometimes wait a minute or so at the counter for that to happen.


I worked at McDonald's briefly 15 years ago. Every single person bagging would mark an order as completed when someone started working on it and then use the recall button to see what the order was.


Exactly the same as my experience working at one circa 2007 or so. It was something the managers would explicitly instruct the line workers to do.


We used to use an app from a startup for local orders that would always say 17 minutes to pickup. They couldn't estimate the time, so they just used a number that didn't look hard-coded. It took some time for us to realize the lie and stop trusting it.


The Domino's operation is pretty digitized. If it's 15 minutes away, you can go and watch the order being made live.

As soon as the order comes in, it pops up on their screen. Is the restaurant busy? That would affect the time.

From there, a person assembles it (you can usually see this from the order counter) and then pops it into the oven with the constantly moving rack.

This part is limited by the speed of the rack, so there is a lower time bound you cannot go under or your pizza is undercooked. Asked for well done? Then it goes back into the oven after coming out.

If you are there, you eliminate the time between when the pizza comes out and when the delivery driver picks it up and delivers it. He could be out delivering something else, or there could be traffic on the way to you.

You can walk in, place the order, and pay… then watch the order process begin in front of you and get the best possible pizza.


Many people refer to Conway's law here, but I'd argue that this law is true only for a static organization design, where structure is rigid and never changes. This law and the title of the link hide a much more important dependency: organization is a function of business requirements. System design is a function of them too. It MAY happen that the organization is designed first, but that is not necessarily the case. Organization changes happen all the time, and systems tend to survive those changes. Many companies have subdivisions organized around the customer journey or certain product topics, e.g. an Acquisition tribe or a B2B business unit. Those structures work very well and do not reflect Conway's law (the system architecture required in this case is often org-wide and modular, requiring all org units to follow the same design approach).


Conway's law would still apply in a company where the org chart is changing. Obviously old systems don't disappear in a reorg, but new ones will reflect the communication structures at the time that they were built.

In companies that go through frequent reorgs, you'll often see a lot of "scar tissue," where you can tell that one set of services date back to a particular epoch. For example, these services are from a different time when the company tried to enter a new market, and here's a bunch of messy microservices from when the company added a bunch of developers after the Series C and the engineering processes failed to keep up. And along the way, you'll see a bunch of half-finished migrations that require both the old and new systems to be maintained simultaneously.

In your example, if that org is adapted to frequently shifting team boundaries with some central top-down architectural authority (CTO, architect, committee, whatever), then Conway's law would predict exactly the kind of modular architecture with a "shared design approach" that you describe.


I spent a fair bit of my career working with a piece of software which has evolved across a number of different operating systems, and, one presumes, development teams.

Among the reasons I refer to myself, tongue in cheek, as a software anthropologist, is what I'd recognised from that software: it bears the marks of the major platforms it evolved through: IBM mainframes, VMS, Unix, MS Windows, and ultimately an Apple Macintosh variant (though principally the tool now seems to be PC-oriented).

Similarly, on Unix, one can often identify applications' origins based on their toolkits and widgets: Athena Xaw widgets, Motif, VUE, KDE, GNOME, etc. Though often criticised for the lack of consistency, I'm aware that different tools reveal their affordances by the toolkits (and GUI presentation and behaviour) they use. Similarly BSD vs. GNU command-line tools, and various other command origins. Most notably, the 'dd' command, which comes from, and borrows syntax from, IBM JCL, the mainframe job control language.

Digging deeper, Unix's origins in Multics, the B programming language, etc., are also apparent.

Again, this is somewhat confounding tools and teams, though I'd suggest that the principles are similar and related. The upshot is that past decisions get layered into software, with the oldest layers typically closest to the core.


When I look at a nightmare scenario like Salesforce, I am inclined to agree with you.


> but new ones will reflect the communication structures at the time that they were built

It is a correlation-vs-causality question. Conway's law may actually be the former rather than the latter.


This. Conway's law is about the actual organization, not the org chart.


It's interesting to think of this in conjunction with the Peter Principle. In any org that's existed long enough, you will have a "crust" of people who have been promoted until they're too incompetent for their position. If long enough time elapses, these incumbents will account for most of the decision makers in the org chart, from the root on down. Given how incentives and motivations change with seniority, it's not a far stretch to assume that they will want to remain where they are - cannot be promoted any further, and would rather not lose current status.

I know it's just a little thought exercise, but I see enough of that reflected in the real world to pause and consider. I would wager that in the majority of cases, organizational structure does indeed remain static, at least in orgs that live on long enough timescales.


Conway didn’t discuss how organizations change over time, but if you look for them you can see the results in the systems around you.

Casey Muratori has some great examples in his video on the subject: https://www.youtube.com/watch?v=5IUj1EZwpJY


Steven Sinofsky wrote about this in the context of the early days of Microsoft in “Don’t ship the org chart” https://hardcoresoftware.learningbyshipping.com/p/047-dont-s...


I'm pretty skeptical of what Sinofsky says. Back in the day, former Microsoft employees called out the differences between what he said and what he did, i.e. "Don't ship the org chart" -> his own org ships the org chart.

https://news.ycombinator.com/item?id=4778996

https://news.ycombinator.com/item?id=4776031


One of the hilarious exchanges (pun intended) that happened on Sinofsky's blog concerned MS Exchange. The executive-level view of events is vastly different from the actual events.

tl;dr:

Stevesi (Technical Assistant to BillG): Mgmt forced Exchange to use NT Directory (followed by glowing description of the NT directory)

DonH (Exchange Directory dev lead/ later Active Directory dev lead): No, NT was late, and eventually canceled NT Directory. Exchange wrote and shipped our own Directory and then moved the code to NT to use as the base for Active Directory.

Stevesi prevaricating about high-level executive view of the interaction of NT vs Exchange directory.

DonH: No, that's wrong. NT provided nothing. Exchange created an email-specific directory. I used that to make Active Directory. Water flowed uphill not downhill.

from https://hardcoresoftware.learningbyshipping.com/p/021-expand...:

"That proved to be a defining moment because deploying a directory was hugely complex and there was no way EMS could do it twice. In one of the rare times an architectural choice was pushed to a team, using the directory from NT became a requirement for EMS. Many others supported this, including the Server leadership. It was to them as natural as pushing Excel to use Windows—the directory was that core to NT Server—while sharing files and printers was the baseline scenario, it was the directory that brought deep enterprise value to customers. For the better part of the following year or more, EMS would not speak well of using the NT Directory, and conversely the NT team felt that EMS was trying to use the Directory in ways it was not designed to be used. This sounded to me a lot like getting Excel to work on Windows, and it played out exactly that way. Had EMS not used NT Directory, it is likely Directory never would have achieved critical mass as the defining app for the client-server era (and remained the cloud anchor for today’s Office 365). And conversely, had the NT team not met the needs of EMS, then the NT Directory would have likely been sidelined by the rising importance of the email directory in EMS. Forcing this issue, while it might be an exception, only proved the strength of a strategic bet when it is made and executed. Still, it was painful."

Comment from DonH Apr 22, 2021, at end of blog entry:

"Speaking as the dev lead for the Exchange Directory (1991-1996) and later on Active Directory (1996-2005), there's a lot wrong with this chapter. NT's approach to functional directory services in the early 90's was "wait for Cairo. they're building one", which meant that we in Exchange had to build our own directory service. When Cairo collapsed (late 1995) Exchange and NT struck a deal so that once Exchange 4.0 shipped (April 1996) one of my developers and I brought a copy of the Exchange Directory source code over to Windows, and we built Active Directory out of that. Exchange in no way "bet on" the NT Directory; we essentially built the replacement for it in order to get the features we needed. Ask me if you need details.

However, the part about endless repeated pressure to build everything (specifically including the directory) on top of SQL is entirely accurate.

I'm only moderately annoyed that I had to pay ten bucks to post this correction."

Second comment from DonH:

"You're missing the point that there was no NT Directory. The strategy given to us was "use the NT directory, which is the Cairo directory. Sorry that doesn't exist yet, so Exchange might need to cobble something together for its first release." I built that something, and later went on to use it to fill the directory service shaped hole in Windows.

Presenting this as Exchange leveraging the NT Directory might be polite, but it is definitely not accurate.

And although I remain eternally grateful to LDAP for saving me from COM I completely agree about omitting it from the history."


Is this a response to UI = f(statesⁿ)[1] or are they both a response to something else?

1: https://daverupert.com/2024/02/ui-states/


Domino's Pizza Tracker

The article is centered around the pizza tracker, but I thought that tracker was fake.

Just for illustrative purposes.

Is it not?

https://www.the-sun.com/money/6927297/dominos-pizza-tracker-...


My former boss wrote the first version of it when he was working for Crispin Porter + Bogusky. He claimed that internally Domino's already had the infrastructure for their own telemetry and logistics, and putting a Python API on top of it to connect to the web was a no-brainer. To hear him tell it, it was his idea. Of course, I later worked for a company that was helmed by the former Domino's CEO of that era, who claimed it was his idea. Based on the technical backstory I would believe my former boss over the CEO. Of course, the existing telemetry at Domino's could be garbage or fake...


That article is basically a "they say, they say" argument between two Twitter users; I'm not sure one could come to any conclusion based on it. It's also a The Sun article, FWIW.

I've always thought the tracker was real, at least here in Spain (it looks different from the US one in the article's pictures), as it always seems to change at different intervals. Could also just be randomized a bit, I guess. It's been a long time since I ordered from Domino's anyway.


I'm sure some MBA douche presented the calculation at headquarters to save the money by making it fake, but when it was first released there was a massive campaign showing Domino's workers getting the orders printed out and hitting buttons at various parts of the process that actually did update the tracker.


I'm not sure about Dominos, but the last time I ordered Papa Johns, the pizza tracker had a disclaimer that said it was "for entertainment purposes only, and does not reflect actual events"


> includes the name of the employee baking each pizza

Yikes, I wouldn't put that info online. What's the point?


The Only Unbreakable Law - Conway's Law.

https://www.youtube.com/watch?v=5IUj1EZwpJY


Seeing all the mentions of Conway's Law here jogged something in my mind -- could one assert the converse and "reverse engineer" the org structure of a company by examining the systems it has designed?

This could be super useful when considering the next place of employment for example; the organizational dysfunctions and idiosyncrasies only become apparent once you've jumped in with both feet.


Legend has it that Google produces a new chat program every year or two when a high-up manager wants a promotion.

Microsoft also seems to have generational UI revamps. There's a funny bathtub curve where Win32 has outlived many of its replacements.


Yes, you can! You can even tell when the organization was changed, since many complex systems are upgraded over time. The example I like best is the Windows volume control(s), as pointed out by Casey Muratori (https://www.youtube.com/watch?v=5IUj1EZwpJY).


I have noticed that the Panera Bread status tracker used to provide good information, but doesn't anymore. It frequently says an order is done while it is still being worked on.

Might be a UI/Organization mismatch, as described in this article.

Or maybe it's the staff intentionally marking things done early, to game metrics expectations from management.


Umm, the argument in the article seems self-defeating. UI=f(org) except we end up with the same UI with radically different orgs because we can just f() over the differences with dark design patterns and users can't tell the difference.

I can show you anything in a UI - only good orgs can develop valuable products from those UIs.


I order from Domino's now and then. I've noticed that someone named Scott seems to be working preparing pizzas at all hours on all days. It didn't take me very long to figure out that the tracker was just putting in a placeholder event. The actual delivery tracker seems to work OK, though.


The Domino's pizza tracking bit is a funny one for those who have read Snow Crash, where a whole early section of the book is about the advancement of the pizza delivery industry.

I don't know, it seems to me the 90s had a very dystopian view of the future Pizza Hut.

« The Deliverator stands tall, your pie in thirty minutes or you can have it free, shoot the driver, take his car, file a class-action suit. The Deliverator has been working this job for six months, a rich and lengthy tenure by his standards, and has never delivered a pizza in more than twenty-one minutes. [..] Pizza delivery is a major industry. A managed industry. People went to CosaNostra Pizza University four years just to learn it. Came in its doors unable to write an English sentence, from Abkhazia, Rwanda, Guanajuato, South Jersey, and came out knowing more about pizza than a Bedouin knows about sand. And they had studied this problem. Graphed the frequency of doorway delivery-time disputes. Wired the early Deliverators to record, then analyze, the debating tactics, the voice-stress histograms, the distinctive grammatical structures employed by white middle-class Type A Burbclave occupants who against all logic had decided that this was the place to take their personal Custerian stand against all that was stale and deadening in their lives: they were going to lie, or delude themselves, about the time of their phone call and get themselves a free pizza; no, they deserved a free pizza along with their life, liberty, and pursuit of whatever, it was fucking inalienable. »


The whole post is based on the questionable assumption that the Domino's Pizza Tracker accurately reflects the status of the pizza, when it could just be a dumb timer based on a statistical average. Sure, the pizza delivery person in the last step has to be accurate, but that's simpler than tracking whether the pizza is in the oven. As for the point itself, I think real-time status tracking is a very, very small subset of UIs. Yes, it's difficult to deliver if the organization isn't designed around it, but most sucky UIs aren't limited by not having the data.
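
If it is a dumb timer, the whole tracker could be as trivial as this (stage thresholds invented): the status is a pure function of elapsed time, with only the final delivery step tied to a real event.

  import time

  # Purely time-based stages; thresholds in seconds are made up.
  STAGES = [(0, "Order received"), (120, "Prep"), (300, "Bake"),
            (600, "Quality check"), (900, "Out for delivery")]

  def fake_status(order_placed_at: float) -> str:
      # No kitchen telemetry needed: pick a stage from elapsed time alone.
      elapsed = time.time() - order_placed_at
      status = STAGES[0][1]
      for threshold, label in STAGES:
          if elapsed >= threshold:
              status = label
      return status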


> it could just be a dumb timer based on a statistical average

The post makes this point explicitly. It doesn’t sound like you read the whole thing.


My bad! I raced through the article, skipping the important bit.


It's not perfect, but at Starbucks the “working on your drink” notification comes when they rip the ticket off the machine, so it is kind of real time. (Kind of, because often they rip off 3 or 4 and stick them to the wall.)


So the output of an organisation is a function of the organisation. Or the amount of effort you put into something affects the outcome. Shocking.


“To the extent that the business takes place in software, designing the software is designing the business.” @mulegirl from Twitter


see also https://en.wikipedia.org/wiki/Conway%27s_law

and the fact that the same tropes are recapitulated over and over again without any reference to prior art. Alan Kay is correct; computing is a pop culture.


Reminds me of the Free Energy principle a bit: https://en.wikipedia.org/wiki/Free_energy_principle


> where the step from “order received” to “pizza in the oven” happens only because of a timer in the UI

Then don't lie. Instead of the status, just say outright that it takes up to 5 minutes for the pizza to get in the oven. And show a timer.



