Hacker News | bthecohen's comments

Hi, one of the authors here! Thanks for your question – you raise a good point about inference and model size.

We touch a bit on potential applications in section 8 of the report ("Future directions"). One of our main focuses is autonomous troubleshooting agents that can query and reason about metrics. In this context, Toto is considerably smaller and cheaper than the LLMs already in use for similar workflows. In future work, we envision using Toto in a multimodal context, where we expect the time series backbone to represent a manageable proportion of the overall compute budget.

We are also exploring how to use Toto to improve our anomaly detection and proactive alerting solutions at scale, and you're right that more work remains there in terms of inference efficiency.


This is a very useful guide. I'd add one additional step that can sometimes be useful if you're doing a more complex change than adding a field to an object. For example, sometimes you may want to read from an entirely new schema or data source, while ensuring that the user-facing behavior stays the same.

In that case, you'd want to add a step between steps 2 and 3 where you double-read, either inline or within a shadow workload, and add telemetry to alert you if there are discrepancies between the two versions.

Only when you're confident that the two data models are producing acceptably equivalent results would you cut over the primary reads to the new model.

This pattern often comes into play when you break off a piece of state from a monolithic database into a dedicated service.
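To make the double-read step concrete, here is a minimal Python sketch; the names (read_old, read_new, the record shape) are hypothetical stand-ins for whatever your two data models actually look like:

```python
import logging

logger = logging.getLogger("migration")

def read_old(key):
    # Hypothetical legacy read path (stand-in for the monolithic database).
    return {"user_id": key, "plan": "pro"}

def read_new(key):
    # Hypothetical new read path (stand-in for the dedicated service).
    return {"user_id": key, "plan": "pro"}

def double_read(key):
    """Serve from the old path while shadow-reading the new one.

    Any discrepancy is logged (in production you would emit a metric
    and alert on it), but it never surfaces to the caller.
    """
    primary = read_old(key)
    try:
        shadow = read_new(key)
        if shadow != primary:
            logger.warning("read discrepancy for %r: old=%r new=%r",
                           key, primary, shadow)
    except Exception:
        # A failure in the shadow path must not affect user-facing behavior.
        logger.exception("shadow read failed for %r", key)
    return primary
```

In a real system the warning would feed a discrepancy counter wired to an alert, and you might sample shadow reads to control cost; only once the discrepancy rate is acceptably low would you flip the primary read to the new path.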


This is actually referring to different species of grapes, and has nothing to do with winemaking technique or prowess.

Virtually every type of "quality wine" consumed in the world comes from the Old World species Vitis vinifera. This includes basically every variety you've heard of: Chardonnay, Cabernet Sauvignon, Pinot Noir, Syrah, Merlot, to name a few French varieties (and the same applies to Italian, Spanish, German, etc. wines). Any American winemakers making those kinds of wines (and I agree that many New World wines go toe-to-toe with the best France has to offer!) are using grape vines originally imported from Europe.

In addition, several species of grapes are native to North America, the best known of which is probably the Concord grape. Due to their flavor profiles, these species unfortunately don't tend to be used for winemaking (one exception being Manischewitz and other sweet ritual wines). However, their natural resistance to phylloxera, exploited by grafting Vitis vinifera onto American rootstocks, saved the European species from decimation and likely extinction.


I did not know this.

However, I would still like to exchange bottles with fellow enthusiasts =)


This is funny, but I think it misstates the real problem. This is what would really happen:

King: "Make me a toaster."

Software developer: makes toaster

King: "Okay, that's great, but could you also make it scramble eggs? It should just be a small change, right?"

Software developer: adds egg scrambling module

King: "Actually, could the eggs be soft boiled instead?"

etc.

In other words, OO and other programming abstractions arose precisely because we actually do need to deal with changing requirements, all the time.
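To illustrate that point, here is a deliberately toy Python sketch (all names hypothetical) of how injecting behavior, rather than hard-coding it, lets the "appliance" absorb the king's changing requirements:

```python
from typing import Callable

def toast(item: str) -> str:
    return f"toasted {item}"

def scramble(item: str) -> str:
    return f"scrambled {item}"

def soft_boil(item: str) -> str:
    return f"soft-boiled {item}"

class KitchenAppliance:
    """The appliance stays the same; only the injected behavior changes."""

    def __init__(self, cook: Callable[[str], str]):
        self.cook = cook

    def prepare(self, item: str) -> str:
        return self.cook(item)

# Each new royal request swaps the strategy instead of rewriting the machine.
appliance = KitchenAppliance(toast)
print(appliance.prepare("bread"))

appliance.cook = scramble       # "could you also make it scramble eggs?"
print(appliance.prepare("eggs"))

appliance.cook = soft_boil      # "actually, soft boiled instead?"
print(appliance.prepare("eggs"))
```

This is just the strategy pattern; the joke's electrical engineer would have to rebuild the toaster, while the software developer swaps one function.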


>"Actually, could . . . instead?"

A phrase that so very often tests the self-control, patience, and will to live of software developers around the world.


Eh.

My answer to that question is usually: "Yes, as long as you're willing to pay for it."

I'm not a consultant. "Payment" comes in many flavors.


> "Actually, could . . . also?"

Is the bigger problem. When I hear "instead", I think, "oh good, we can delete a bunch of stuff we don't need anymore", but when I hear "also", I think, "oh crap, not only do we need to add stuff, we need to make the existing stuff more flexible".


I enjoyed reading it - it's amusing and some of it resonates with my experiences.

...But I don't think it's great satire, for the reasons you point out.


This happens all the time in all engineering projects. Generally, if the requirements change that drastically in the middle of the project then either the initial requirements were terrible or the requested change is just a bad idea.

Many an engineering project has been sabotaged because someone tried to force a bad change into the middle. I could go on all day just with examples from the Pentagon.


But the EE can start over and still finish earlier - and deliver toast in the interim.


We just adopted Angular for a new project at my job about a month ago. And while I understand the reasoning behind all the changes, Google does come off as quite aloof with respect to the business implications this will have for real users.


This is a head-scratcher. I have two ideas about why Apple may have done this:

1. They are planning on releasing their own calculator notification center widget.

2. This is the hypothesis put forward in the article: that Apple conceives of notification widgets as essentially read-only sources of at-a-glance information.

Either way, it's clearly a case of the right hand not knowing what the left is doing.


Or a case of early validation for speedy App Store deployment (which in 99.999% of cases doesn't pose any further question), followed by human control and judgement (which takes a little more time).

I would rather see that as a good thing.


I always say that what our industry needs is more incoherent misogyny.

