arthurdenture's comments - Hacker News

> Something that surprised me is that [1] is of a minor key, but the solfege marks are for the relative major

Major and minor keys use the same shapes / solfege system. The major scale is "fa, so, la, fa, so, la, mi, fa", while the minor scale uses the same syllables but starts on la. This means the shapes always represent the same intervals (i.e. the same pattern of whole and half steps) whether a song is in a major or minor key. It's hard to describe, but it makes sense when singing.
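One way to see the interval claim (my own sketch, not from the comments above; the helper name is illustrative): rotating the major scale's whole/half-step pattern to start on its sixth degree (la) produces exactly the natural minor pattern.

```typescript
// Whole/half-step pattern of the major scale, in semitones: W W H W W W H
const MAJOR_STEPS = [2, 2, 1, 2, 2, 2, 1];

// Rotate a step pattern to start on a different scale degree (0-indexed).
function rotate(steps: number[], degree: number): number[] {
  return [...steps.slice(degree), ...steps.slice(0, degree)];
}

// Starting the same pattern on la (the 6th degree, index 5) gives
// the natural minor scale pattern: W H W W H W W.
const minorSteps = rotate(MAJOR_STEPS, 5);
console.log(minorSteps); // [2, 1, 2, 2, 1, 2, 2]
```

So a singer reading shapes never needs to know which mode the tune is in: the syllable-to-interval mapping is identical either way.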


I'm aware that they are the same intervals; perhaps my brain is destroyed by my music theory classes, but I really think of a minor scale as a different beast from a major scale starting on the submediant. [1] indicates that other movable do systems may use "la" based minors, and that it is "sometimes preferred in choral music, especially with children," so I suspect this distinction is something I learned.

1: https://en.wikipedia.org/wiki/Solf%C3%A8ge#Minor


One limitation is that the shapes are really only useful for singing, not for instrumental music. They also would not be meaningful for music that switches keys or that isn't tonal (i.e. isn't in a particular key at all). So they are a very good fit for what you might think of as "church music" (at least written before the 20th century), but for other settings they would be either meaningless or a hindrance.

That's one reason shape notation never caught on; another reason is that the notation was introduced by publishers of American sacred music and not by, say, music conservatories.


Is it possible that many of those uses predate the current zoning? There are numerous buildings in Chicago that would not be permitted by their current zoning.


I just looked it up on Chicago's zoning map and no: at least the half dozen parcels I checked were all properly zoned.


I asked Imagen 2 to generate a product icon with a transparent background, and it generated an actual grey-and-white checkerboard pattern as the background of the image... https://imgur.com/a/KA2yWHp


That's because it was trained on RGB images without an alpha channel. There is currently no public image generator that understands alpha channel.


As a user, this really frustrates me. Prompting is not precise enough to compose a bunch of specific elements, so the obvious solution is to do several prompts, each with transparency, and then combine them in Photoshop/Photopea. Instead I end up asking for a white background and then cutting it out manually.


I feel like someone could solve this with a little background-removal AI in the pipeline. I go through the same process, stitching together a few tools, so it's obviously possible... but it sure would be nice if it all fit together better. Something where "transparent background" in the prompt was translated to "white background", and the result then went through background removal.


The closest I've found is vector generative AI like what's in Adobe Illustrator today.


Like the other commenter said, these models aren't trained on images with an alpha channel. Given the same-sized model, supporting it would make typical results worse to benefit a niche use case. You should be able to have them generate this style of image on a background you can color-key out, though.
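A minimal sketch of the color-key idea (my own illustration, not anything the models provide): treat near-white pixels in an RGBA buffer as background and zero out their alpha.

```typescript
// Make near-white pixels fully transparent in an RGBA pixel buffer
// (4 bytes per pixel, same layout as canvas ImageData.data).
function keyOutWhite(pixels: Uint8ClampedArray, threshold = 240): Uint8ClampedArray {
  const out = new Uint8ClampedArray(pixels);
  for (let i = 0; i < out.length; i += 4) {
    const [r, g, b] = [out[i], out[i + 1], out[i + 2]];
    if (r >= threshold && g >= threshold && b >= threshold) {
      out[i + 3] = 0; // zero the alpha channel
    }
  }
  return out;
}

// Two pixels: one white (backdrop), one red (subject).
const demo = new Uint8ClampedArray([255, 255, 255, 255, 200, 30, 30, 255]);
const keyed = keyOutWhite(demo); // white pixel's alpha becomes 0, red pixel untouched
```

The obvious limitation is that naive keying also punches holes through white areas inside the subject, which is why an ML-based background remover in the pipeline does better.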


Those examples look nice and would be trivial to automatically cut out / trace into a transparent vector with Inkscape.


Thankfully, macOS and iOS have a fantastic ML-powered "extract the image subject into a new image with a transparent background" function that you could use on this silly output to get what you want.


Luckily there is another AI for removing the background (:


Extremely useful .gitconfig snippet:

  [alias]
      git = !sl
catches e.g. `git git status`, which I run shockingly often.


That was the case many years ago (an extremely customized Perforce server), but after that there was a full rewrite as a distributed system built on Bigtable etc. - https://cacm.acm.org/magazines/2016/7/204032-why-google-stor... is an overview. It wasn't a ship-of-Theseus situation where they incrementally replaced components.


You're comparing regional rail terminal stations, which serve many lower-frequency lines, to subway platforms, which typically serve a single higher-frequency line (2-10 minutes between trains). That's the scenario where a mezzanine is less useful.


I completely agree with the more general point, that consuming external data requires a validation layer. But oh boy do I have feelings about class-validator.

Here's what it looks like to correctly annotate an array of objects with class-validator and json-schema annotations:

  @ApiProperty({ type: Foo, isArray: true })
  @ValidateNested({ each: true })
  @Type(() => Foo)
  @IsArray()
  items: Foo[];
It's not just that you have to define everything in triplicate, it's that the failure mode for forgetting any of the above is to silently not validate your data. Unless you're very careful, you don't get the safety benefits that were the whole point of using class-validator in the first place.

If I were starting from scratch, I would instead consider either io-ts or a solution that involves a code generation step, where this entire category of risk is avoided.
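To illustrate what "avoiding the category of risk" means, here's a self-contained sketch in the spirit of io-ts (not io-ts itself; the `Decoder` type and combinators are my own illustration): the runtime check and the static type come from a single definition, so there is nothing to forget to annotate.

```typescript
// A tiny "parse, don't validate" decoder: one definition drives both
// the runtime validation and the inferred static type.
type Decoder<T> = (input: unknown) => T; // throws on invalid input

const num: Decoder<number> = (input) => {
  if (typeof input !== "number") throw new Error("expected number");
  return input;
};

const str: Decoder<string> = (input) => {
  if (typeof input !== "string") throw new Error("expected string");
  return input;
};

// Decode an array by decoding every element.
const array = <T>(item: Decoder<T>): Decoder<T[]> => (input) => {
  if (!Array.isArray(input)) throw new Error("expected array");
  return input.map(item);
};

// Decode an object field-by-field; the result type is inferred.
const object = <T>(fields: { [K in keyof T]: Decoder<T[K]> }): Decoder<T> => (input) => {
  if (typeof input !== "object" || input === null) throw new Error("expected object");
  const out = {} as T;
  for (const key in fields) {
    out[key] = fields[key]((input as Record<string, unknown>)[key]);
  }
  return out;
};

// One definition, no triplicate annotations:
const Foo = object({ id: num, name: str });
const Items = array(Foo);

// Typed as { id: number; name: string }[] - or throws if the data is wrong.
const items = Items(JSON.parse('[{"id": 1, "name": "a"}]'));
```

Unlike the decorator approach, omitting a combinator here is a compile-time type error rather than silently unvalidated data.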


The alternative would be to write a side effect inside your `render()` function, which is illegal -- it breaks the new Concurrent mode rendering, which can call render() speculatively.

I mean, I see why it's intuitively unappealing to you, but there are perfectly good reasons for the design.


There are "perfectly good reasons" in the sense that if you artificially limit yourself to managing routes with the React component tree, yes, it makes sense how they arrived at this solution. But after dealing with it for a while, it's just so obviously not the correct approach, and it's crazy to me that it's the de facto router. Cargo culting at its finest. The router could exist outside of, or at a higher level than, the component tree.


It's the de facto router because it has gone through many iterations, has lots of users, is easy to bring into a React app, and doesn't fight with React - it works with it. Plus there are a variety of approaches to declaring routes. [0]

I'm not sure I see how obvious it is that a <Redirect/> is wrong. But that's fine, I take your point that it's not particularly intuitive.

I'm of the opinion that routing, in general, is a function of application state - and I like to manage my application state with Redux - so I will often also mix in connected-react-router [1]. This lets you do navigation w/ an imperative API [2].

[0] https://github.com/ReactTraining/react-router/tree/master/pa...

[1] https://github.com/supasate/connected-react-router

[2] https://github.com/supasate/connected-react-router/blob/mast...


> The alternative would be to write a side effect inside your `render()` function, which is illegal

That's one alternative. Another is not to try implementing behaviour that has nothing to do with rendering using a rendering library in the first place.

A horrible amount of accidental complexity has been created in the React ecosystem when people have tried to use it like a full framework. If all you have is a hammer, maybe it's time to consider using other tools as well.


Interactions can be extremely negative without making the news.


and positive

