
I imagine it's mostly going to be folks who were planning to leave anyway, and this is the nudge they needed to do it sooner.

The downside to this approach is that they will probably tilt more towards senior and staff engineers who have been driving important projects and likely were going to leave once the project ships (or gets cancelled).

Now they leave six months earlier, leaving teams full of new or junior-level employees without much context. The company is full of smart folks though, and they will recover. It will just be a painful year as teams scramble to figure everything out.

It's also a potential F-U to Meta, which just did broadcast performance-based layoffs. Future employees will take note, and it will make it harder for Meta to recruit.


This kind of nonsense was definitely a factor in cancelling my upcoming Meta interviews.

I'd like to think that if I (or anyone) was not performing up to par, our managers would TELL US instead of the CEO deciding to do a layoff and character assassination. One of these is productive; the other puts on a show for Wall Street at the expense of your employees.

It didn't help that Meta demanded that I re-interview for the same level that I interviewed for and they offered me two years ago, either.


I've been fortunate that my manager previously told me "if you're surprised at the performance review, I've failed at my job". That feels like the right approach - everyone should know their situation as soon as something starts going wrong, not suddenly at company-level layoffs.


Seems like it's fair to assume that both canines and humans are highly adaptable, and we behave the way we need to in the environment we're in. My dog has never shown me anything except love and compassion, and that gets him the best living conditions, back scratches, and the tastiest food.


> Seems like it's fair to assume that both canines and humans are highly adaptable, and we behave the way we need to in the environment we're in.

If that were as true as you seem to be suggesting, there would be no difference between dogs and wolves when the latter are kept as pets, which is clearly false.


I constantly run into situations where I need to nest an iterator computation, and things like "it" get confusing.

I'm all for adding language features to avoid boilerplate, and it's clearly useful. I just want to call out that anonymous typing can be polarizing in large codebases, and it should maybe only be used sparingly.


I doubt I'll use it in committed code very often at all; this feels more like something that's very useful when hacking at something in the REPL.


    nested_example = [1, 2, 3, 4, 5].map do
      [it, (1..it).map { it * it }]
    end

This has an ambiguity, so you just add |x| to one of them:

    nested_example = [1, 2, 3, 4, 5].map do |x|
      [x, (1..x).map { x * it }]
    end

Seems like a mental overcomplication of a non-issue to me.


Implicit `it` shadows in other languages like Kotlin just like any other variable scope, and I really can’t say I’ve ever had it be a problem (implicit receivers, on the other hand, can be a right pain in the ass in poorly designed libraries and can I just say Gradle can go right to hell for it).
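
For example (a minimal Kotlin sketch of my own, not from any real codebase), nested lambdas shadow the outer `it` the way any rebinding would, and IDEs typically flag the shadowed implicit parameter:

    // Minimal illustrative sketch: the inner lambda's implicit `it`
    // shadows the outer one, just like any other variable scope.
    fun main() {
        val nested = listOf(1, 2, 3).map {
            // outer `it` is the list element; the inner `it` shadows it
            (1..it).map { it * it }
        }
        println(nested) // [[1], [1, 4], [1, 4, 9]]
    }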


It’s a nonissue in small projects, but I’m not sure I agree that `{ x * it }` is easy to reason about when coming back to this block in 6 months. It’s mostly something that I’ve seen bite engineers during refactoring.


This being Ruby, you can monkey patch blocks to add an additional t to every nested block's `it`.

(This is terrible please don’t)


I can't tell if this article is trying to make it seem like a negative or not.

Every decade defines its own unique style, and it spreads until it's no longer distinct.

3rd-wave coffee shops all look like that because they started during the peak of the farm-to-table aesthetic. It's funny, because that was born as the answer to 1990s strip-mall sameness, where you'd find Chilis and Starbucks. I remember designers losing their shit when they found wood planks from fallen-down barns they could use in their interiors.

Now that people are getting bored of farm-to-table, it'll be on to something else. When we look back in 50 years, all of this "sameness" the author is pointing towards will mostly just serve to define 2010-2020, with whatever name historians decide to call it.


There is a trend towards fewer colors overall. Something is changing.

https://lab.sciencemuseum.org.uk/colour-shape-using-computer...


Maybe we are overstimulated by visual content and attracted to otherwise less visually striking objects, either to find some peace or to not be distracted from that content?

At least for me, visually minimalist objects and environments are appealing because I generally feel overstimulated in modern society and by modern technology. I might feel different if I were to spend my days in the field doing slow, repetitive hard work.


I don't really buy this line of reasoning. Nature can be immensely colorful yet soothing to all the senses. You don't need to stuff your house with psychedelic rugs, tapestries and color-changing LEDs for it to have color. Including color does not mean that your space is cluttered or maximalist in style.

We got a designer for a flat that was being renovated, and I realized that I needed to seriously involve myself in the project if the result was not going to turn out a lifeless, gray-white-beige soup.

Now the flat is full of green, (warm) wooden, peach and red elements, giving it a cozy autumn-y vibe. And I don't need to stuff it with props that "pop" to add personality - it has one of its own.

The prevailing wisdom seems to be "Make it all greige, and introduce life with (often pointless) colorful props". I feel this is more offensive to the senses than a tastefully colored environment where you don't need to "add" anything.


In some markets at least, like cars, this blandness is a sales optimization. Strong distinct colors are more likely to block a sale than bland generic ones.

It may not be exactly the same in other markets, but there's still a cost to offering additional color variants. It's a cost optimization to offer the fewest colors that are appealing, or at least aren't objectionable, to the widest range of potential buyers.


I suspect that's the demographic shift towards an older average population.

I've only got anecdotal examples of old people preferring blander colours, hence "suspect" rather than "think".

When my dad died, my mum replaced all the furniture and carpets with blander versions of the same — in particular replacing the bold orange living room carpet and dark green patterned hall, stairs, and landing carpets with mild off-white.


> I suspect that's the demographic shift towards an older average population.

I agree. The downside is that this is a sort of permanent/long-term shift, since the world is generally getting older and we are reproducing at below-replacement rates.


Millennial Gray


1. It's cheaper.

2. Design values and aesthetics are ossifying around certain "hard" rules.


> can't tell if this article is trying to make it seem like a negative or not

It left no ambiguity for me. Both the project (which is great art in itself) and the commentary open a door to a hard truth, and new questions emerge.

In tech, might we learn from this?

From TFA: "Looking for freedom, we found slavery."

Every parent. Every patrician, condescending patron. Every "elite". Each has their own idea of what "freedom" is best for others.

And the people say: "Boaty McBoatface!"

In cybersecurity we want people to be safe. But the people say: "Give me TikTok, Microsoft..."

Would Rousseau want us "being forced to be free"?


> Would Rousseau want us "being forced to be free"?

Probably. The actors of the French Revolution were inspired by his work, and the Revolution led to the Declaration of the Rights of Man, which states that all people are born free and equal. If you take this claim as universal, then your moral duty is to make it happen.

So yes, free people even if they believe their situation is fine, perhaps because they've been brainwashed into believing it is, or because they don't even know what freedom is.

Unfortunately, countries/organizations/persons that claimed to do that too often also had ulterior motives.

The phrase "looking for freedom, we found slavery" is interesting because if there's slavery, then there's a master. The master we are talking about is "norm" or "convention". The term "slavery" can be apt in some cases, but generally is excessive.

"Anticonformity" was somewhat a dismissive word in my country up to some point, but I have not heard it in that way for while, which suggests that people accept more easily "abnormal" people. When you want your freedom and you can have it, then there's no slavery nor tyranny.


You've inspired me today.

It's odd that in almost all walks of life, freedom comes from non-conformity. In arts, and in science, "progress" is a movement away from wherever we are. We may not like non-conformists, but we absolutely rely on them to move forward.

Yet totalitarian technological social systems are the only creation that make those who conform "more free". By design they limit options at the margins while enabling those in the middle of the mediocrity curve. Slavery is freedom. And thus, since time stops for no man or machine, systems invite their own inevitable destruction.

Technology is not just "a way of not having to experience the world" (Max Frisch), but a way of not having to experience change. And I think the "master" that you speak of is fear (fear of things changing).


2010s visual art was full of supersaturation and hyperrealism due to Marvel and Instagram filters.

Sony's "Into the Spider-Verse" freed itself from the shadow of Pixar and has forever changed what high-quality, mass-audience, computer-animated films can be, and DreamWorks has already shifted its own design towards a more expressive, unique cartoon language.

Also, A24's unique visual aesthetic rose through the 2010s but remained quite obscure until "Everything Everywhere All at Once" finally brought its "freedom" of visual expression to mainstream audiences 2 years ago. Now others want to capture that same visual style, and I'm seeing a broad shift away from oversaturation and hyperrealism, toward the more muted and mundane. I'm impressed that Marvel is already applying some of it in "Agatha All Along".

Anyway, my point still stands: artists will constantly push back against whatever is mainstream in an effort to be seen and to define their own aesthetic/brand. It's easier to be 1 of 10 than 1 of a million. Plus, if they're lucky enough, they get to be known for starting or accelerating the movement.

Every single decade has very clearly defined aesthetics that designers copy until something new comes along to disrupt it.


For the very same reasons you point out, there is also a style shared by the servers who work at these coffee shops. We would expect young people from the same generation to adopt similar hairstyles, facial hair choices etc.

When we look at 70s interiors full of leather upholstery, wood panels and with a desaturated look from old film camera pictures, they feel outdated. It's hard for me to imagine looking at these white, clean HD interiors and saying "oh this feels so outdated" - but I bet that's exactly what people will think decades from now.


What looks modern is as arbitrary as what looks futuristic.


Exactly. This is how you can place things like the TVA in Loki and the office in Severance in time, even though they do not belong to that time canonically.


I was hoping this article would cover non-obvious date time quirks when conveying this to humans.

If the context is an event, what does midnight mean? A one-day event might start at 6pm but not end until 2 or 3am. A computer considers that a two-day event, but that's kind of intuitively wrong. To a human, it's only one day, and it doesn't end tomorrow. It ends late tonight.

What if it starts Saturday at 9pm and goes until 6am? You would still consider that a Saturday event, but one that ends at sunrise, or early in the morning.
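
To make that concrete (a hypothetical Ruby sketch, with a made-up 6am cutoff rather than any standard library behavior), one approach is to shift timestamps back a few hours before taking the calendar date, so a 2am finish still counts as "tonight":

    # Hypothetical heuristic: anything before a 6am cutoff belongs to the
    # previous calendar day, so an event ending at 2am still "ends tonight".
    require 'date'

    CUTOFF_HOURS = 6

    def human_date(time)
      (time - CUTOFF_HOURS * 3600).to_date
    end

    starts = Time.new(2024, 6, 1, 21, 0) # Saturday 9pm
    finish = Time.new(2024, 6, 2, 2, 0)  # Sunday 2am by the calendar

    puts human_date(starts) == human_date(finish) # => true: one "human" day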


The problem with 2-4 hour boardgames with extensive rulebooks launched in the last decade is that these games are islands. No game can point to another game to offer mental shortcuts in absorbing similar gameplay.


But they absolutely can. Go see Friedemann Friese's Copycat, which has zero unique rules: everything is stolen from a popular game insiders know. Hell, it even has a misprint on purpose, matching a famous misprint! You can teach it in 5 minutes to the right people, just by reference.

The modern, Kickstarter-heavy 3-hour monstrosity just can't assume that the buyer has played all the games whose mechanics it is basically lifting, while only explaining the 2 or 3 places where it is doing anything interesting. But when you go through live rules explanations among people in the industry, half of the rules are really handled by reference, because you know what is going on. With some designers, the rules are almost unnecessary, as the player aids and the graphic design do 90% of the work. I've played that game with a certain designer: he sat there to answer questions, but he didn't even hand us a rulebook or provide an explanation. Just the components in front of us and 'figure it out', as an experiment on the game's learnability.


Best Twitter was when it was just tweets of "eating cereal for breakfast" and the pull-to-refresh-ification of everything.


This is the right answer, sadly. I had moved over to Mastodon and already people are using it to raise pitchforks. I just don't think our human brains deserve the power that comes with social media.


Andreessen and Co. have always been part of this strange utopian Libertarianism where they believe with all their hearts that tech sets you free. I say strange because it's optimistic and forward-looking, not cynical. Peter Thiel used to be the main "villain" while the others kept to themselves. Seems that's not the case anymore, and they are more comfortable speaking their minds.

Realistically though, as an outsider, I think X.com has just become their echo chamber and they've all lost the plot.


Clojure is a quirky language, and I really enjoyed writing a proof-of-concept microservice with it back in the 2015 era when everyone was shouting "Scala is the way!" I was able to prototype with it and stand it up in a weekend. With the tiniest bit of code, I had the exact service I needed. It ended up in production after only a couple of weeks, most of which were spent wrapping my head around the Docker and Mesos setup we used to run the .jar.

However, it's a quirky language. So, my quick take: as a glue layer on top of the JVM, it was quite powerful, but jank has me scratching my head. LISP doesn't really read well the bigger the codebase, and as something to write software in a standalone environment, it makes me a bit hesitant.

I sometimes would hit walls, because in real world software, you need persistent state. Functional software, for obvious reasons, fights against that, and so modeling state is actually quite difficult. This is where I think, as a small layer on top, it's fast and effective. I would just not want to write more than a few files with it. Happy to follow along with this project, though, and see where it goes.


Oftentimes, when people learn Clojure, they are also learning functional programming and lisp and data-oriented programming and interactive programming all at once. Writing Clojure requires a shift in perspective, coming from the imperative and object-oriented paradigms, and without that shift, even basic programs can seem impossible to build. This shift was very difficult for me, coming from C/C++/C#/Rust, and I've spoken with many others who've felt the same. I suspect, when you talk about hitting walls due to state, it's due to that perspective shift not being complete.

Does that mean I'm saying Clojure is the best lang for everyone and it's their fault if they don't get it yet? No, certainly not. We need to do better with that, in the Clojure world, to make that bridge easier to cross. But, having fought my way across that bridge, I can confidently say that mutable state is no problem in Clojure. We have first-class support for it and that support is thread-safe by default. Unlike something like Haskell, effects aren't scary and don't need to be wrapped. They can be ad hoc.

jank inherits all of this and just brings it to the native world. That means lighter binaries, faster startup, and easier interop with native libs. Aside from that, it's Clojure.


> I sometimes would hit walls, because in real world software, you need persistent state. Functional software, for obvious reasons, fights against that, and so modeling state is actually quite difficult.

I haven't felt this, Clojure has good primitives for dealing with state, both temporary and more permanent. The biggest difference from (most) mainstream languages is that it's very explicit what has state and where you're mutating it. But personally I feel like that makes it easier to manage complicated state relationships, rather than harder, since it's typically isolated in smaller parts.


Clojure offers so many ways for you to manage state for different use cases. Atoms, volatiles, agents, vars, refs.

Functional programming never fights against persistent state. It simply carves out the functionality by use case and offers you more choice in managing it.
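
For anyone who hasn't seen them, a minimal illustrative sketch (mine, not from the parent) of the two most common: atoms for uncoordinated state, refs for coordinated transactions:

    ;; An atom holds one piece of mutable state; swap! applies a pure
    ;; function atomically, and @ (deref) reads the current value.
    (def counter (atom 0))
    (swap! counter inc)  ;; => 1
    (swap! counter + 10) ;; => 11
    @counter             ;; => 11

    ;; Refs coordinate changes to several pieces of state in one STM transaction.
    (def from (ref 100))
    (def to   (ref 0))
    (dosync
      (alter from - 25)
      (alter to + 25))
    [@from @to]          ;; => [75 25]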


    LISP doesn't really read well the bigger the codebase
[citation needed]

