Hacker News new | past | comments | ask | show | jobs | submit login
Val, a high-level systems programming language (val-lang.dev)
423 points by cpeterso on July 18, 2023 | hide | past | favorite | 277 comments



I might actually be more excited about Val as a C++ successor than I am about Herb Sutter's excellent CppFront... Admittedly, all I know about it is from the two presentations below. But, from what I can tell...

Statically compiled. Statically typed. Interops with C++. Memory safe. Typesafe. Data-race-free.

The elevator pitch I give when describing it goes like this: Imagine you were starting a new C++ project and didn't really care about performance (spoiler: perf comes back in the end). So, you decide to not bother using pointers or references anywhere. You just pass by value, return by value everywhere, all day long. If you ignored the obvious perf problems of passing around maps of vectors of objects by value, wouldn't it be nice? you don't have to worry about side effects or data races or anything. And yet, the data is not immutable. You can go ahead and mutate it all you want worry-free because the data is all completely local to your function.

Well, turns out the Val folks have figured out that by eliminating pointers and references from the language, the can get the compiler to automatically pass-by-const-reference and return-value-optimization under the hood such that it preserves both the performance and the semantics that you want at the same time.

Val: A Safe Language to Interoperate with C++ - Dimitri Racordon - CppCon 2022 https://www.youtube.com/watch?v=ws-Z8xKbP4w

https://cppcast.com/val-and-mutable-value-semantics/


Listen to the last two episodes of ADSP Podcast with Sean Parent, regarding Val.

Adobe Research labs is seriously sponsoring its development.

https://adspthepodcast.com/


> Interops with C++

Doubt. I simply don't believe any other language out there is capable of interoperating with C++. Even C++ compilers have broken binary compatibility, different versions of the same compiler even.


Clasp might be such a language, it seems.

https://github.com/clasp-developers/clasp


Sounds like we're trusting a lot to the compiler. Will this lead to me making a small change which breaks an optimization in some distant code and ruins performance?


Maybe we should tell the compiler somehow that we don't want to pass a value, but a reference. For example we could use a * or a & symbol.


Wait for the blog post on how beautiful Val++ is.


Val<<>> when they inevitably introduce templates and stringstreams


In 2028, when we revive the One True Programming Paradigm (for Enterprise Software)™, we'll have bestowed upon us the glorious Objective-Val


In 2028, we will still be writing code in C and Fortran. And there will be all these new languages X++ that would be promising to replace either or both ;) ;) ;)


X++ already exists, better pick another name.

EDIT: https://learn.microsoft.com/en-us/dynamicsax-2012/developer/...


Swift++, or Swifttt.


Taylor Swift’s subject-oriented programming language – Swifties.

It would have: • garbage collection (Shake It Off) • error correction (Bad Blood) • closures (Closure) • optionals (Would’ve, Could’ve, Should’ve) • automatic reference counting (Right Where You Left Me) • REPL (I Knew You Were Trouble) • variable mutability (Everything Has Changed and Evermore) • guard condition (Eyes Open) • strict typing (You Belong With Me)

After compiling run the binary by issuing the command: Run {filename}

To debug: Tell Me Why {filename}


I was almost murdered a few years ago when I told a classroom full of senior girls at a suburban high school that Taylor Swift was overrated (I just wanted to see what would happen if I poked the bear).

That is all I know about Taylor Swift. But it seems like you are onto something here. :D


You were lucky to make it out alive.


bahahaha this is such a nichey yet epic comment


Petition to change all languages with “+” to repeat the last character for each plus. Going down the list of languages on Wikipedia[0]

A+ becomes AA C++ = CCC Clik++ never mind this was a bad call. JJJ RRR Visual JJJ XXX (nice) xBaseee ZZZ

[0] https://en.wikipedia.org/wiki/List_of_programming_languages


X+++


With AbstractSingletonProxyFactoryBean and family present as core language features.


Might have a future in mobile.


sounds good, but ill wait for c-interop so my moneys on valjective-c


Better write it like `Val< <> >` otherwise the parser will get confused.


I believe rust is sort of similar in that apart from the references, it's very heavy on the move semantics. And IIRC, sometimes it's doing extra memcpys. But I don't know if Rust is explicitly targeting const reference passing.

That said, if you're all in on it I imagine that the front-end could pretty aggressively target that, much like how rust uses no alias.


By default, non-references in Rust are passed by move, although you can derive `Clone` on types composed of all `Clone` types and then explicitly call the `.clone()` method to copy things. The only types that will be copied by default are the ones that implement `Copy` in addition to `Clone`, which in the standard library is on primitives like integers and characters but overall is used fairly sparingly.

References need to be specified as mutable if mutation is needed (i.e. `&mut T` instead of just `&T`). Using immutable references is also encouraged from the borrow checking rules; having a second reference (either mutable or immutable) alive at the same time as a mutable one is a compiler error, but using multiple immutable references at the same time is fine. (Pointers are the same way, but not really used much outside of bridging with unsafe Rust, since they can be null and therefore can't be dereferenced outside of `unsafe` blocks).


I think OP was referring to the fact that implementation of moves can sometime (more often than desired I would add) do memcpys. e.g. if you have a struct of ~100 bytes and you move it, it will probably emit a memcpy to write it in the receiving function's stack. AFAIK this happens because addresses are kinda considered observable, and not doing these copies could thus be observable and potentially break some weird code. I don't think Val would have the exact same problem because it doesn't have the concept of addresses/pointers.


> AFAIK this happens because addresses are kinda considered observable, and not doing these copies could thus be observable and potentially break some weird code. I don't think Val would have the exact same problem because it doesn't have the concept of addresses/pointers.

I honestly don't understand what you've said here at all. I'm not familiar with what "observable" means in this context, and googling "observable aliases programming languages" didn't help clarify it (two results were articles about C/C++ aliasing, one of which used the term "observable" once but didn't seem to define it at all, and the other which was the Wikipedia article on "Aliasing (computing)" that didn't contain the term at all).

Even if I knew what "observable" meant, I think you might be missing a negation (or maybe including one you didn't mean to) somewhere; if addresses are "kinda considered observable", it's not obvious why it would be an issue to omit copies and "be observable". Maybe understanding what's meant by "observable" would shed some light on what you mean here, but naively, I don't understand how code that expects things to "kind of" hold a property would break if that property were somehow more strongly held.


> addresses are kinda considered observable, and not doing these copies could thus be observable and potentially break some weird code

This is certainly the case in c++: distinct objects have distinct addresses, so to remove a copy the compiler has to prove that the aliasing is not observable which is hard.

But does rust have the same guarantee?


> You just pass by value, return by value everywhere, all day long.

I cannot efficiently mutate or even store references in structs anymore. That seems to be a show-stopper if you want to have any non-trivial or non-standard data structures in the program. Granted, they are hard to get right, so maybe it's a feature.


I just learned about Val today, but Rust has similar constraints. To write to a particular variable requires having ownership or a mutable reference, which only one part of the program can have at once. Since you can't have two mutable references in different parts of the program or in different threads, you need some sort of atomic encapsulating data structure to mediate access. Something like a mutex.

Newbies to Rust have a lot of trouble learning how to get the borrow-checker to accept their code, but what these restrictions are doing is syntactically ensuring there are no deadlocks or data races. It is presumably a similar story with Val.

If Val actually handles this differently, I'd love to know.


Rust's borrow checker can prevent data races, but it can't prevent deadlocks.

The Pony language has something analogous to Rust's ownership system (there it's called reference capabilities [0]), but in addition to that Pony prevents deadlocks too [1]. It does so by not providing locks, but only providing actors with message passing (channels). Now, Pony is a much higher level language and unsuitable for some of the low level stuff that Rust does.

If you restrict your Rust program to not use locks, but just send messages through channels between threads [2], you won't have deadlocks either. But that's quite a handicap for a low level systems language, so Rust in general can't commit to that.

[0] https://tutorial.ponylang.io/reference-capabilities/referenc...

[1] https://www.ponylang.io/discover/#what-makes-pony-different

[2] If you would rather prefer to write async code, there are async channels too, but to be deadlock-free you need to always guarantee that your futures are polled https://rust-lang.github.io/wg-async/vision/submitted_storie... which is maybe a strange requirement that's alien to most languages


You can trivially deadlock just with queues and messaging.


> It does so by not providing locks

> If you restrict your Rust program […] you won't have deadlocks either.

So Rust is directly superior, because it gives you a choice? I get it, there is something nice about having an enforced paradigm, so you're not forced to work with foreign 'bad' code, but I don't understand your argument:

> But that's quite a handicap for a low level systems language, so Rust in general can't commit to that.

Rust is fast, so it can't afford being slow. Pony solved this problem by being a slow, high-level language, so now it can block fast solutions and get away with it, because it's slow anyway, and therefore it's presented as a superior alternative to Rust - which can do the exact same thing, and have at worst the same speed as Pony, but we expect more from Rust (because it's better) therefore it's worse? Just doesn't make sense to me, sorry :D

Again, I see the advantage in the aspect of a language being simpler, and enforcing a paradigm on the ecosystem so developers have less head scratching to do when cooperating.

What's nice about Rust is that it enforces a safe paradigm at first, but over time one learns Rc, RefCell, unsafe "rustonomicon" to be able to optimize. In *the* Rust book, channels are taught first, immediately at the beginning of the chapter about threads.


Rust deadlocks just fine.


Rust can prevent races, but generally speaking it cannot catch deadlocks.


It seems you would have to use the `unsafe` escape hatch for data structures such as linked lists, and externally verify that the implementation is correct. Which is better than the status quo of C++ where you are always in needed of externally verifying safety.

As for the safety mechanisms discouraging non-standard data structures, the language designers probably would consider that a feature, rather than a bug.

I am curious whether the use of a broken unsafe construct can break the safety guarantees of safe constructs, or if the unsafety is contained no matter what.



> You just pass by value, return by value everywhere, all day long

This is actually how I've been writing PHP programs for years, passing arrays around like I just don't care.[3]

I think the PHP interpreter is smart enough to actually only copy-on-write, but I admit I've never profiled these programs. But anecdotally, I've done some pretty complex stuff[1] this way and it never caused noticeable performance problems.

[1] Complex as in 'wow I sure am making the computer process a lot of data, parse some DSL character-by-character, dynamically generate SQL and massage the results into some arbitrary structure for EACH PAGE LOAD, geez'. Not as in 'print hello world, but using Laravel', which somehow manages to be about 30x as computationally expensive[2].

[2] quip about how 'web artisans' spend the time between page loads brewing coffee using their CPU as a heating source while thinking deep thoughts. But I digress.

[3] Back in the PHP 4 days, objects, too, acted as if stored in variables rather than just pointed-to by them. Not that anyone expected language design excellence from Rasmus et al, but strikes me as a missed opportunity when they changed object variables to having reference semantics to have 'value, variable reference, or pointer-to-value' be a property of variables orthogonal to the type of value they store/reference. e.g.

  $a =  $b; // Copy $b's value into $a
  $a = &$b; // $a and $b are now the same variable
  $a = *$b; // $a points to the same object as $b, but unlike $a = &$b, the variables themselves are not linked


c++ interop always seems like some sort of magic power, given how few languages achieve it.


To be fair, I don't think even C++ is very good at C++ interop.


What language has C++ interop?


Haskell:

It is implemented as a library, without needing compiler support [*].

You can write C++ in TemplateHaskell (Haskell's strongly typed AST-generating macro system), which allows you to use Haskell variables in scope inside your C++ functions.

The "inline-c" library allows this for C and C++.

Example of its use in the Haskell OpenCV bindings: https://hackage.haskell.org/package/opencv-0.0.2.1/docs/src/...

The way it works: It invokes the C++ compiler for you to generate a wrapper function that closes over the Haskell variables used in your code snippet, and on the Haskell side generates a corresponding type-safe function call for you.

This approach allows you to use all C++ features that a C++ compiler supports. But it also carries the drawback of ... invoking a C++ compiler, which gives you the slow build speed of C++.

[*] Some compiler support was added to be able to specify compiler flags that are specific to the C++ compiler but not the C compiler.


Love the non-standard formatting :). It took me a while to see where haskell ended and C++ started. Seems very powerful.


D does. Not source compatible, but mangling and ABI for linking. And D can generate headers for `extern(C++)` declarations defined in D


Swift just added C++ interop in version 5.9. https://forums.swift.org/t/c-interoperability-in-swift-5-9/6...


Nim does as well. You can use the C++ backend and wrap a decent bit of C++. The imports aren’t automatic so you need to define the FFI, but you can call templates, implement virtual methods, call constructors, etc.

NimForUE takes advantage of this: https://youtu.be/Cdr4-cOsAWA


i've come across carbon (google's c++-next language), clasp (common lisp with an llvm backend), felix (compiler generates c++), and maybe a couple of others i'm forgetting.

also some languages have packages that deal with c++ interop to varying extents - see this discussion: https://www.reddit.com/r/ProgrammingLanguages/comments/huhy8...


Besides all other answers, .NET ecosystem, and IBM i TIMI, by having a C++ compiler as well.


> Statically compiled. Statically typed. Interops with C++. Memory safe. Typesafe. Data-race-free.

So it is for C++ what rust is for C?


Val has a similar but different strategy to the big problem as Rust. In Rust the reference semantics are lifetime checked (hence all the talk about a "borrow checker") but in Val there are no reference semantics.

In Rust there are some things you can't (safely) do because the checker can't see why they're ok. Val guesses that, with appropriate language features in place, you can extend that to everything and still have a useful language but now you don't need to teach this complicated and difficult feature.

Because Val is young it's not yet obvious whether this is basically always better, or whether it's too limited.


> Because Val is young it's not yet obvious whether this is basically always better, or whether it's too limited.

This is sound wisdom. It's not always obvious at the beginning what the limitations are going to be, and what the opportunities are going to be. If it's a promising direction, pursue it, even though it won't always work out.


Since Val seems to be liable to infer more about the actual data flow, I wonder how does it reflect on compile times.


The a typing feature[1] in the language Eiffel was based on the assumption of the author that the typing holes it apparently introduced could be statically detected. It turned out he was wrong after the language was released IIRC. I guess you have to be pretty certain of something before basing a language around it.

On top of that being a generally shite language doomed Eiffel.

[1] IIRC again, you could cancel features in subtypes, so "all birds can fly" but "penguins are birds but can't fly". That would break the Liskov Substitution Principle, which Bertrand Meyer thought he could deal with but apparently couldn't.


May many languages be as successful as Eiffel is.

Eiffel Software is still in business, whereas many others hardly reach any audience.


Very true. So many languages people haven't heard of. It's like if not in the top 10 or 20, some think they don't matter.

In regards to Eiffel, always felt that hindrance to wider adoption was more about the licensing. Still, they are in business and in use.


I know it is used in education in some parts of Europe. But now I'm wondering, is it used in production? Or rather, does it have a niche or an industry where it is more widely used? I'd be curious to know what software is written in Eiffel! (Mainly to get an idea of what Eiffel source code looks like, beyond toy examples)


As someone generally not in the loop on these kinds of languages, what is the “big problem”?


If another function / sub-routine / thread / whatever alters this Thing while I'm looking at it, lots of nasty surprises await. For example maybe the Thing is a container of Objects and I was looping over it, taking out one object at a time and sometimes calling unrelated_function() but for some reason unrelated_function destroys the Thing occasionally and then... Boom, use-after-free.

It turns out that although we've often thought of this as a variety of different hard problems, including "data races" and "use after free", they're actually all one big problem, that of mutating something while somebody else used it. Solving this effectively solves all of those hard problems, at least in a subset of your language where you are able to address it.

In a language like C or C++ it's easy to make one or more references to a Thing, and then give away the references, or the Thing, or both, and then the programmer loses track (or maybe never knew they were related) and these nasty surprises are their reward.

In Rust, their borrowing / lending metaphor prevents the surprise. If you lend a mutable reference to the Thing, the borrower can't destroy it, that's not what "lending" means, and you can't even use it until they've stopped borrowing it. If you lend immutable references, nobody can destroy it, or mutate it at all, until all those references are given back. But this does make the language more complicated because of the new metaphor.

In Val, you can't have any references, so unrelated_function couldn't destroy Thing, you've got Thing, so unrelated_function doesn't have Thing and can't destroy it. No surprises.


> Solving this effectively solves all of those hard problems, at least in a subset of your language where you are able to address it.

No, "solving" this by preventing it in the first place just makes programming way too hard.

In practice, using something while other people are using it is perfectly fine in most cases. E.g. most Python programs do that, and most don't have bugs most of the time.

The "use-after-free" problem can far more easily be solved by removing `free` (e.g. by using Garbage Collection).

Most concurrency bugs remain even if you solve all data races (e.g. as Python does, with GIL) simply because the semantics are wrong (e.g. having to update 2 counters which cannot happen simultaneously), or iterating through a list while modifying it (no data races here if done from a single thread!).


Python still has data races. Modifying variables is not atomic.


And of course the trick is to avoid the nasty surprises and do do it in such a way that is still is easy to program. If you can only do it by adding additional hoops for a programmer to jump through (like having to annotate all your function parameters with lifetimes), it's not ideal. Val's subscripts are a nice idea; they make writing a function that returns a reference quite easy. On the other hand, it doesn't look as flexible as a regular function, so maybe it has the same amount of hoops, just different ones.


I thought it was that Go is to C what Rust is to C++? This seems like another animal altogether.


Rust is very much aimed at the same space of Go where Go can be used to replace C.

I'd argue Rust is more aimed at the C space than the C++ space because of its accent on systems programming.


> ...interops with C++...

Does it? With what qualifications? Requires C++ that uses clang modules or something?


What are you quoting?

EDIT: I assume you meant to copy "ìnterops with C++". The answer is that it basically doesn't right now, but they intend for it to in the future somehow.

https://accu.org/journals/overload/30/172/teodorescu/


Sorry, yeah. Autocorrect. I fixed it.

And if that's a goal, fine, but if there isn't interop yet, we should be hesitant to describe Val as having C++ interop. Especially since C++ interop probably requires some compromises, such as requiring modular C++ in some form.


In a lot of cases won't the C++ compiler just inline methods and remove the copies anyway...?


Inlining functions and copy elision are two different things. If you inline a function without copying it's args, then it'll change the value of the original argument: Copy semantics lost!


Right. I wonder if copy elision can be ignored if functionality is verified to be the same? I'm just a hobbiest c++ dev so sorry for ignorance :)


does this mean your object tree is acyclic since you can't have references ?


Val will have statement level unsafe annotations (not even full unsafe blocks), according to the podcast linked in a nearby comment. Linked lists will be tricky or impossible to do safely, like in Rust.


I’ll have to see about this, but I don’t really like that.

Yes, you can replace a block of unsafe calls with a bunch of individually unsafe-annotated lines. But a good model of unsafe to follow is that whatever unsafety you expose within a block should be brought back within safety requirements when you exit the block. Put differently, an unsafe block should act as unit-safe and not leak unsafety.

If all you get is statement-level unsafety, you have no way to indicate at what point you’ve re-upheld safety guarantees.


cross platform build tools similar to cargo or go build are important to me.

battery included stdlib that also has network and gui is super attractive too


Part of me loves the advent of so many zero-cost-abstraction languages. The other part wishes the systems programmers would pick a winner already so that the already small community of systems programmers do not end up multifurcated into even smaller communities of Rust, Zig, Val, etc developers. I hope the systems language wars end soon in other words.


Maybe we picked one too early and didn’t have enough innovation in this area for too long. Bring the war :)


I like Rust, but damn is it complicated and very heavy. I really just want C with namespaces, actual enums (tagged unions are a bonus), a "modern" type system, and proper package management. They all have their place, IMO it's not that hard to switch between languages.



You're in luck, C++ covers that.


I think when someone says „I want a C that’s cleaner and has some simple, generally loved features“ they don’t think of C++?


Probably not, but they should.

C already felt primitive in the mid-1990's, and by C++98 most of those features were available.

So they can use C++ for those fixes, with its JavaScript/TypeScript kind of symbiotic relationship for better or worse, or stick to C and keep waiting for such language, that will most likely require parsing include files with some kind of tooling.


I didn't mention, but minimal is something I desire in a language too. Honestly, C++ is such a "design by committee" kitchen sink language that I have no interest in ever touching it. Yes, you can restrict yourself to a subset of the language, but then every time you need a library, or need to find some info on how to do something on SO or whatever, you have to deal with the rest of the language. You cannot simply ignore it because everyone else won't.


There is also a bit of committee on everything WG14 does since C89.


C++ has proper package management?


vcpkg and conan nowadays.

Not using one of them is making oneselfs live difficult on purpose.


Not even a war, right tools for the job. I don't think it's possible for one language to rule them all. Options are good.


I think at this point Rust is the winner just because it's already overcome the incredible hurdles of being shipped in the Linux and Windows kernels themselves. In that regard, I doubt there's room for any more of these new systems languages now.


I have the intuition that rust can’t possibly be the winner, it really feels too crude regarding the way it manages lifetimes. It really feels like the clumsy grandparent of the language that will indeed solve those problems elegantly.


But we made do with C for how many years? OP's question isn't whether Rust is as good as systems languages will get, the question is whether any successor language stands a chance of being included in the kernels.

The bad ergonomics of C were never enough to get another language into the Linux kernel, it took a language that solves the number one class of security bugs. It's unclear that any successor to Rust will be able to show as clear a need.


> the question is whether any successor language stands a chance of being included in the kernels

Uhh no. No that is not “the question”.

My question is “what language will I use to write video games”. C++ sucks and Rust isn’t a great fit.


I guess Jai aims to move into the gamedev space of this, but the language has a terrible presence online

https://inductive.no/jai/


I wrote a large blog post about Jai. https://www.forrestthewoods.com/blog/learning-jai-via-advent...

It’s a very interesting language that does a lot of things “very right”. But it’s also a long ways from being broadly available.


I like your style of writing :) - I watched a bunch of youtubers use it to do various things and was impressed but really I just want them to have a website and a way to sign up for beta easily so I can play with it too.


Rust is fine in an ecs engine like Bevy. Absolutely awful in traditional oop setups though.


Bevy is very cool and I follow it closely. It has yet to ship a meaningful project.


Not sure what you mean by "traditional oop setups" and why anyone should care about it, that's more of an outdated design pattern and a bad choice of the past than a desired language feature.


That's not how I read OP's question. OP was wondering whether a lang besides Rust can be "the winner" (unqualified) given that Rust has already passed the hurdle of being included in the kernel. It didn't sound like at all like their question was limited to whether there's room for another lang in the kernel.


But isn't it worth spending a few years poking around before we lock ourselves into fifty more years of bad ergonomics, now that things have clearly reached the point where there's credible motion towards a C alternative?


Rust's lifetimes have a steep (un)learning curve, but once you know what you're doing, they're fine. Really. IMHO Rust has ergonomic problems around effects or generic numeric code, but not memory management.

AFAIK no silver bullet has been discovered yet that would be an improvement over Rust that doesn't have some other trade-off. Val's mutable value semantics is more local and limited. It "solves" the problem of ugly lifetime annotations by not supporting complex zero-cost lifetimes at all. That doesn't mean Val can't be successful — it can be easier to use by supporting simpler constructs and focusing less on zero-cost abstractions, like Swift, but its ideas aren't stop-the-presses for Rust.

We're already overdue for having a more modern, practical replacement for C. Waiting for a hypothetical better-than-Rust language will only mean staying with C for even longer.


Ada already existed, politics and a free beer compiler ecosystem also helped.


But we made do with C for how many years?

K&R C was, what, 1972...and it's 2023 now. So that's 51 years with no definitive end of 'making do' in sight.


Same opinion here. Let us look at Val and Vale.

There is also Carbon and CppFront but those are more of C++ evolution than new-born things.


Like what happened with Java. Once a language has accumulated enough popularity it’s extremely hard to depose it, due to network effects. The switching costs are enormous.


The role of Rust in the Linux kernel is so far extremely limited, it's more of an experiment than a production tool at the moment.

I don't think that the "window of opportunity" for another language in the kernel has closed. If anything, Rust has shown what would be the hurdles to overcome, so it may inform the next attempt if it happens.

And of course Linux is not the only kernel out there, and new kernels can also arise.


Won't happen. But all the more reason to expand wasm + wasi support everywhere.


Are those languages able to do low-level hardware things?


Wasm is WebAssembly, which has a nice compact set of instructions to compile to and network effects from being the 4th language of the web to ensure its staying power. WASI is a WASM system interface for connecting with hardware and other systems:

https://hacks.mozilla.org/2019/03/standardizing-wasi-a-webas...


It rather looks like WASI interfaces to code which is written in one of the other system programming languages that are the topic in this thread, so I guess the answer to the parent commenter's question is "no"? Am I misinterpreting this?


If you want to be pedantic, sure. The point is whether those interfaces are written in C or Go or Rust that they are interoperable, and a dev can write their core logic with a wasm compilation target in Zig, Val, or whatever other language comes up. Hardware and OS devs can write WASI modules in whatever language they prefer so that others can consume.


I don't think this is being pedantic. We here talk about languages like Rust, Val, and also C (which we attempt to replace) as languages to interface directly with the bare metal. Think device driver or a kernel itself.

To me it initially looks like WASI does not let you use WASM as a bare metal language (I don't know enough about WASM to judge whether that's even really sensible). Instead, you have a layer on the bare metal (or even several layers above) implemented otherwise, and then you can use WASM still further above, to interface to that layer.

But again, maybe I misunderstood.


Personally, I don’t think you’re being pedantic at all either because it seems to me that the web assembly stuff is less capable in comparison to things like MLIR. It also required and maybe still requires being hosted in a JavaScript runtime, i thought.


No longer does Wasm/WASI need JS host! There are many spec-compliant runtimes built for environments from tiny embedded systems up to beefy arm/x86 racks:

- https://github.com/bytecodealliance/wasm-micro-runtime

- https://github.com/bytecodealliance/wasmtime

- https://github.com/wasmerio/wasmer

- https://github.com/tetratelabs/wazero

- https://github.com/extism/extism (disclaimer, my company's project - makes wasm easily embeddable into 16+ programming languages!)


thanks for the links : i'm reading them and didn't know of those updates


Not really, unless some future hardware supports such a beast, could be an interesting approach for AI/WebGPU interfaces.

It's more for service-oriented applications currently... you pass stuff in, get stuff out.


My emotional journey with Val having just learned about it today:

Oh neat, a new systems language, probably nothing but let's take a peek. Docs look legit. Hmm some thoughtful ideas in here around ownership. Syntax makes sense. But is it different enough to justify its own existence? Who makes it?

Oooh, Dave Abrahams is working on it. We crossed paths at Apple and I remember his Crusty talk about Swift [1]. It was great, loved the strong opinions, but Apple sadly removed it years later because it had some outdated advice. Wait, he's at Adobe now? So is this an Adobe language?

Conclusion: keep an eye on it, will watch the linked talks, wait and see.

[1] edit: I found the Crusty talk! "I don't do Object Oriented!" https://www.youtube.com/watch?v=p3zo4ptMBiQ


Before his time at Apple, David Abrahams was very prominent Boost contributor, a member of the C++ committee and a strong proponent of Stepanov-style generic programming.

He ending up in the Val project is not surprising!


Ha, I really loved this style of describing the journey. More people should write like this :)


As a compiler developper (yeah _I'm something of a_ ...) I'm shoked to see

- https://github.com/val-lang/val/issues/758

- https://github.com/val-lang/val/issues/711

That smells bad implementation. You should self-host ASAP guys, you'll find more basic bugs like that. Yet 500+ stars !


As another compiler developer, there are few mistakes more damaging than self-hosting too soon. A language will be a net benefit for small and medium programs long, long before it's a net benefit for (and has the tooling for) a project as big as a compiler. I'm glad they're not rushing it.


Bugs like that in a production compiler could be worrying. In a compiler for a language still being defined, not so much. Compare Rust 1.0 vs 1.71 to see how much of a gulf there is between "first stable release" and "compiler of a stable language with lots of real world usage".

Self hosting is also not the be-all and end-all. It is a symbolic milestone, more than anything. Achieving it is something to celebrate, but not something to hold against the language until much later on their development cycle, if at all.


Swift itself still doesn’t self host and is approaching a version 6 release. I’m not a compiler developer but the outcome of all the discussions around this topic in the Swift community was that it wasn’t worthwhile yet.


Where was that discussion held? I'd like to read what people were thinking.


There's quite a few discussions around the topic here forums.swift.org


I've had "playing with val" on the back-burner for a while. Finally tried to today, only to learn that even after 4,000 commits it's still a long shot from usable. Most of the examples from the language tour don't compile, even seemingly simple ones.

Carbon has yet to deliver on anything yet, even as Sean Baxter has been making lots of progress on the Circle compiler (including implementing lots of the good bits from Carbon).

Hard to be a successor when you can't really be considered a language yet.


So now we have V, Val, Vala, Vale. Any other language I'm forgetting?


There’s also VAL, one of the 80s era languages that influenced Haskell (via SISAL). Alas, outside the parallel languages community I don’t think many people are familiar with that VAL.


Variable Assembly Language!


It will work to their advantage. Anyone learning one of these language is looking for "the new hotness", and will probably become aware of the other similarly named languages in the process, and will thus look them up. JavaScript did it, how did that turn out for them?


Javascript was the only game in town in the browser for cross-browser support.

The name didn't matter at all, it could have been named Fooscript and still be popular.


Coulda named it after a skin disease!


I legitimately thought this was Vale. Rather unfortunate collision in naming


They are trying to concentrate safe systems programming in permutations of V +{a,l,e}


There is a Beef programming language ... what about Veal?


Hahaha, it would fit perfectly. We should open an issue in Beef programming language issues list to change the name because it is violating the V +{a,l,e} rule. :D


We all just have similarly good taste in naming =) party on this end of the alphabet!


Bonus: V is also known as Vlang and Vale used to be called Vlang.


Yep, then today's Vlang got to that name first, so we had to call it Vale instead!


and you can code with them inside vi or vim!


Volt


Valtown


Verve..? Virgil for sure.


Vau comes to mind.


Only allowed if it's fexpr based though.


It’s brought up every time, but the only thing I don’t like about Val is that Vale and V are two other systems languages that are also new. Those three can so easily be mixed up because of naming.


There’s also Vala. A somewhat different niche, but one more similarly named language.


Had to go to the link to confirm whether this was the projection language or the region language.


There's also VAL: Variable Assembly Language https://en.m.wikipedia.org/wiki/Variable_Assembly_Language


Related. Others?

Val, a new programing language inspired by Swift and Rust - https://news.ycombinator.com/item?id=31788527 - June 2022 (19 comments)


I've skimmed through the "mutable value semantics" paper: do I understand correctly that authors suggest replacing references and pointers with either: a) nested structs (with some indirection via `Any` with runtime-checks); b) array indices?

The following phrases got my eye: "It is reasonable to ask, then, how we can use mutable value types to represent self-referential data structures, such as doubly linked lists and directed graphs. In fact, any arbitrary graph can be represented as an adjacency list. For example, a vertex set might be represented as an array, each element of which contains an array of outgoing edge destination indices.". I don't see how one can reasonably implement doubly-linked list without re-inventing memory heap and some kind of garbage collection in the implementation.

UPD: found some discussion at https://github.com/orgs/val-lang/discussions/736 , seems that there will be some escape hatches akin to Rust's `unsafe`. That resolves all issues, and "whether the safe subset of Val is enough for reasonable application" question is open to long years of debate.


Indeed, you can reinvent memory by using arrays, and using array indices instead of pointers.

It's much less silly than one could think, because these indices are locked to your particular data structure (doubly-linked list, graph, etc) and cannot be manipulated to access anything else in the program, at least not easily. This applies bot to the compiler checking that, and to the attempts to crack the program.


> these indices are locked to your particular data structure

Are they, though? Consider I have a linked list (or a map/dictionary, or any data structure that allows removal of arbitrary elements). I add elements with indices 1, 2, 3, 4, 5. I remove elements with indices 2 and 4. I'd like to make memory consumed by these elements available again. At the same time I don't want to change indexing scheme of the whole data structure. So I have to keep "holes" with indices 2 and 4 at the very least. After few million operations that may become very inefficient.

In a typical language, that would be resolved by having a global allocator that allows indices 2 and 4 to be reused in neighboring data structures. If I don't implement a global allocator that shares indices between objects (as that kills the whole point), I consume more memory than needed and rely on my own clunky implementation of garbage collection/defragmentation, don't I?


I think lists as arrays is just example of what can be used instead of pointers. Also section was titled "representation something" so probably it is just one possible low level representation of linked lists. And Val can have standard library of lists, including represented as arrays with build in online reindexing. But this makes Val high level lang so not really system programming language...

But what irks me is: why not just program in plain C with passing structs by value ?? All aliasing will be gone and probably compilers already know how to deal with that. "Becouse somehing_hard_to use is included" is just dumbing thing down and not gaining much. Lack out of bound indexes checking ? That's like one function. Call it everywhere :) Programmers need to learn dealing with reality (what features CPU's gives) instead of being castrated. Becouse moving problem somewhere else do not annihilate problems. Maybe we need high level CPU's and some hw level guarantees some engineers can bake in :) But that can end like hardware on call register saving :)


From the example on their home page:

    ...no unnecessary allocation occurs.  The result of *longer_of* is a projection of the longer argument, so the mutation of z by *emphasize* occurs directly on the value of y. The value is neither copied, nor moved, and yet it is not being passed by reference to emphasize
I'm not sure how to read this. A string arg is supplied to a function and a character is (or characters are) appended to it. How can this happen without a new copy of y, since that string must have had an initial lengh (and therefore, a specific place and size on the stack)? Do they create strings with extra padding at the end, just in case someone might append to them? (If so, how much padding and why isn't that generally less efficient overall?)

Edit: I didn't realize that asterisk bolding didn't work in quotes. Decided to leave it in anyway.


The four space prefix is for code/monospaced blocks, not really for quotes.

Generally on HN people (myself included) seem to use the '> ' tradition for quoting, which the parser doesn't treat specially but works out fine.


Ah you're right and I'm not new to HN formatting. Was just late and I forgot.


Just because it is an immutable value doesn’t mean it needs to live on the stack.

That said, I was wondering about a similar thing: what if I modify a 1GB string? Is it copy-on-write?

I feel like it can be optimized away 99% of the time only to leave you with 1% of cases that result in very-hard-to-find performance degradations.


> Just because it is an immutable value doesn’t mean it needs to live on the stack.

Right. But the context of the example made me imagine it probably was -- since I would likely reject out-of-hand any new language that always allocated variables on the heap.


To answer my own question a couple of days later, I suppose that if they secretly shadow the mutated var name with a new longer value (in this case) on the stack -- perhaps that is what is meant by "a projection". In that case, no new allocation would occur, since the stack is already allocated. I'm not a language or compiler expert, so maybe this terminology is well know in those circles.


There's more of an explanation further down:

   To better understand, notice that longer_of is not a function; it’s a subscript. A subscript does not return a value, it projects one, granting the caller temporary read and/or write access to it.
I'm not really sure how this doesn't fall under a move? Regarding the string, they could also place it on the heap to allow for the variable sizing, which is what Rust does.


The thing I haven’t seen yet that makes me raise my eyebrows is sharing one allocation between two data structure. That’s where things get really gnarly with Rust references. The borrow checker is extremely easy to deal with until you start trying to store references in structs, yet the examples I’ve seen with Val don’t show that.

Maybe these cases can be simplified, but I’m very skeptical. It’s a fundamentally difficult problem domain considering that these cases are also very difficult to reason about in C/C++. Questions like: will this request last longer than the current thread?


Unless I got mixed up, from what I remember of previous article it’s because that’s not possible. Val’s references are not first-class, you can’t return or store them.


edit: these comments are general, not directed at Val.

A humble suggestion to anyone coming up with a new language.

Please make it as familiar as possible.

For example, do you really need to come up with some new syntax for defining a function, or specifying types and returns?

If you want people to use your new language, reduce the cognitive load - do things the same as other languages and be different only where you must.

Every time you come up with some completely alien programming construct, you have set a barrier to learning. Coming up with a new language should, for the most part, be an exercise in copying the best of how other extremely popular languages do things, and on top of that build in the necessary differences that make your language special.

Also, less is more. For example, one of the reasons Rust is so complex and intimidating is it has six sublanguages https://gist.github.com/brendanzab/d41c3ae485d66c07178749eae...

    the expression language
        unsafe runtime language
        safe runtime language
        compile time language
    the type language
    the trait language
    the macro language
    the attribute language
Whereas Zig goes the other direction and attempts to make everything programmable in the one core language, even ditching macros entirely. https://ziglang.org/learn/overview/. In Zig you can write compile time code in the same language and same code base.

I put forward these suggestions not as a language edxpert - I am far from it - instead I put them forward as a frustrated learner wanting to do new things and looking at new languages and thinking "shit, I already work with three of four languages plus any number of subtechnologies on a given system, I just don't have time/headspace to learn a programming language with six sub languages".


I disagree. Syntax is the thing everyone loves to comment on because of the bikeshed effect, but it's probably the quickest aspect of a new language to learn. If you're already creating a new language and have the opportunity to fix some things that make older languages harder to remember, parse, or type, why not take it?


Alternatively, where the semantics are different to existing languages, try to make the syntax different too. Languages which look the same as existing ones and behave differently are pretty rough going for newcomers and anyone programming in multiple languages.


One reason I prefer Erlang to Elixir is that I appreciate the fact that Erlang syntax is very different, and I feel like it helps me think in Erlang. I don’t think about Python or Ruby.


I probably should have said "did you really need the syntax and the behavior to be different?"

My point is not so much about syntax, it is about unnecessarily doing things in a conceptually different way.

If different behavior at a given point is core to what makes your language special then sure thing, make it behave different, but implementing some different behavior without a significant payoff only makes your new language harder to grasp.

Presumably you are making a language because you have some core concepts or beliefs you want to bring to life - all I'm saying is - make that happen, and make the rest as familiar as possible for most programmers.


Reminds me of Bjarne Stroustrup talking about how every time they add a new feature to the language, it's unfamiliar to anyone so everyone wants it to be REALLY LOUD AND OBVIOUS. "Stand back! I'm going to use a LAMBDA here!" But, after a short time, the new features are so common and familiar that everyone starts complaining about how loud they are. "Why is the syntax for lambdas so noisy! [](){}! Gah! Can we at least make the () optional to quiet it down a bit??"


They do have a good reason to do what they do though. They explicitly say that they’re going for a language where you can pretend there are no references and things are just mutable values. And you can’t really do this without coming up with new names. Because of course there are references under the hood.

If Rust goes the explicit modifier route, e.g. something can be & and also mut, and you need to know what these are, Val goes the implied behavior route where a ‘set’ parameter can be set, while a ‘let’ parameter can’t.

If they didn’t come up with new semantics, they’d just be making a Rust clone. I for one am very looking forward to how Val develops. Currently it seems like it can offer similar guarantees to Rust while using a simpler mental model to deal with.

I’m curious to see how they are going to tackle concurrency. Their roadmap outlines that as one of the trickier parts given the semantics that they’re going for.


I never get this argument. The syntax is important for consistency, but in another sense is trivial. It's the smallest thing about a new language that you have to learn, I don't understand why people get hung up on it.


I often find people get very hung up on it because they don't have much else to talk about. The look and feel of the syntax is the least important part of a language but its also the easiest to complain about.


This syntax looks almost exactly like Swift, which is also where it borrows its semantics from.


Yeah, it feels like Swift with lifetimes (which are coming to Swift). Other than the abstract benefits of mutable value semantics for generics, I wonder what would make one reach for Val over Swift (or going the other way, Rust)


Adobe likes Swift, but doesn't think it will ever strive outside Apple's ecosystem, as it was it Objective-C as well.

Additionally Dave Abrahms would like to have another go at designing Swift.


Wait a second, isn’t Mojo supposed to be the Swift but better language- there’s another one already?

https://news.ycombinator.com/item?id=36281245


Surprise it turns out various Swift designers have different points of view.


What's unfamiliar about Val? It looks very similar to most languages released in the past 10 years with minor changes.


Clearly, this is why Rust has been unsuccessful. /s

But seriously, if you aren't doing something really different, why bother. If you find Popular Language X (PLX) too limiting, there's probably a dozen or two "it's just like PLX, but with a sprinkling of things from Y, which is pretty much like PLX, but sweeter sugar, and that makes it perfect". As others have pointed out, syntax is trivia[1].

[1] Unless, of course, you think a BEGIN..END language is nifty, and then you're a heretic and someone is coming with torches and pitchforks. :-)


P.S. - I'm not talking about PL/X, which is a thing, because of course it is. PL/I with some PL.8 and PL/S, and probably some PL/M. Nope...not joking.


I disagree with this a lot.

I think most programmers have been trained to think that C like syntax and semantics is fundamental to programming languages, as if languages are just inherently C like, and that C is just what programming is, but this is not true.

Programming language theory is an entire field of study on it's own, and there are a lot more possibilities than just rehashing C for the nth time.


> The result of `longer_of` is a projection of the longer argument, so the mutation of `z` by `emphasize` occurs directly on the value of `y`. The value is neither copied, nor moved, and yet it is not being passed by reference to `emphasize`. The body of `emphasize` owns `z` in exactly the same way as it owns `strength`, which is passed by value: `z` is an independent value that can only be touched by `emphasize`.

This evasive phrasing, which continuoes after this excerpt too, has me highly skeptical of their good intentions… Any good reason they are not more explicit?


As someone who has deliberately avoided C++ for the last 20 years (in favour of functional programming) I find the discussions of "reference semantics" and "value semantics" (let alone "mutable value semantics") to be quite opaque. It is as if the C++ community has become an enclave of folks who put up with the extreme complexity of C++ and speak a correspondingly tortuous theoretical language.

What the seem to be saying here is that the "subscripting" operation returns a view into its argument, not entirely unlike the concept of a lens. The only thing that view can be used for is directly accessing the the part of the value that is in focus—the view is not itself a first class value, which means that so-called "reference semantics" don't come into the picture.

I don't think they're being evasive or promoting their idea in bad faith. They are just operating in a characteristically arcane way for C++ language design people.

The following blog post helped me start to grasp what this aspect of C++ talk is actually about: https://akrzemi1.wordpress.com/2012/02/03/value-semantics/


Here's an example of how C++ terminology gets developed, from Stroustrup's own website: https://www.stroustrup.com/terminology.pdf

To me, this seems like a proliferation of distinctions and enthusiastic theorizing. Finding solutions which actually simplify the task of programming and/or clarify matters seems a long way from this attitude.


What evasiveness? To me it is very clear what is going on.


The human equivalent of "it works on my machine".


I'm very happy to see continued investment into mutable value semantics. When I was writing Swift, it felt absolutely magical to have most of the guarantees of functional programming without the performance penalty and intermittent awkwardness.

Val just doubles down on the ideas in Swift and goes so far as to remove references altogether. This is a really interesting space.


How come this page reaches top of HN ? Has anything new happened to this language ?

I was hoping there was because i liked the idea when it was announced a year ago, but the page doesn’t seem to show anything new.


It's like the second-hand clothing store in my town that used to be called "New to you".


Now it has official backing from Adobe, if you listen to ADSP Podcast.


For me I'm interested in the type system provided by a language.

For Val, I was a little hesitant when I saw the docs say "Two or more types can form a union type, also known as a sum type." since a union type and a sum type are similar but distinct concepts.

For a language, it is very important to understand the differences between these because I think they will greatly impact the design of the language.


This is one of the better (and better looking) landing pages I have seen for a programming language. Very good copywriting too.


I agree. I had to look up all the other new languages that start with V, because I feared missing out, and I liked Val's page best.


I love that we see new languages all the time, enabled by reusable toolchains like LLVM etc. but I'm always disappointed how close they are to existing languages.

It's understandable why a company like Apple would seek Swift to be familiar to developers, but I don't know why new efforts that will be building a community from scratch would do this.

We need more imagination and more bold deviations from the norm, if any of these languages will make sense to pick up for a dev out there.


You are phrasing it like a universal truth. If this is important to you, do something about it. Otherwise you are just complaining.


I'm doing something about it.

But I miss an ecosystem of people doing something about it. That variety is required for healthy progress in a field, so that people can inspire one another and lead to significantly new efforts. One instance is nothing, even though, of course, everyone believes they're making the language that'll take over the world.

I often open a new language's site anticipating innovation, wondering how what I do compares with it. And it's always disappointing to see "it's like X but slightly better", because "X but slightly better" has no chance of overtaking an incumbent, and it also reflects an attitude where we think everything there is to discover has been discovered, and from now on it's just tweaks and patches with a new brand label on top.

Nothing can be farther from the truth.


But there are ecosystems of people doing that research into significantly new efforts. I'm thinking of Idris and Agda, for example.

> We need more imagination and more bold deviations from the norm

This is what I reacted to, a bit too compactly - apologies. Let me unpack.

My problem is that 1) this statement feels so far removed from reality, and 2) I'm so interested in this space that it hurts.

Why would you recommend others to throw out the good ideas of decades of research just to be hip? Evolution doesn't work that way. Humans didn't evolve by throwing away state-of-the-art apes. Evolution happens by changing one or few bits at a time and iterating while selecting for the good. This means there will be many programming languages that look just like those before them - at least to the untrained eyes.

Mutable value semantics is one of the most fascinating ideas in programming. It's like the good parts of functional programming without the bad parts. I'm pretty certain it will revolutionize programming. To dismiss it as being close to existing languages blows my mind.

I get that it's hard for any language to take hold without being 10x better. At the same time, it feels naive to build a 10x better language without keeping the good parts of existing languages. Not removing good ideas is as important as adding good ideas.

Also, humans are 'stateful' and switching costs are real. "10x better" has to account not only for the upsides but also for the downsides. Hence, it seems a sane strategy not only to maximize the upsides of new ideas, but also to minimize the downsides: the downsides of removing good ideas, and of change in general, to keep switching costs low.


In addition to Val and Vale, there is also Vala: https://vala.dev/


"Our goals overlap substantially with that of Rust and other commendable efforts, such as Zig or Vale" - oh well, but not with Vala, I guess?


Val overlaps with Vale but not with V and Vala. :D


> Integer numbers are typically represented by the type Int, which represents a machine-size integer (usually 64 bits on modern computers).

I'm not a systems programmer, but I thought there was a consensus these days that this was a Bad Idea™? Are the benefits of marginally better performance on certain platforms really worth having your program break due to an overflow when you run it on a different platform?


Zig has both fixed size types and a native-pointer-sized type. I think it's the right approach, in general you need both.
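
For comparison, Swift takes the same approach, and overflow on the word-sized Int at least traps at runtime rather than silently wrapping; a quick sketch:

    // Int is word-sized; the fixed-width types are the same on every platform.
    let wordSized: Int = 1
    let exact: Int32 = 1
    print(MemoryLayout<Int>.size)    // 8 on a typical 64-bit target
    print(MemoryLayout<Int32>.size)  // always 4

    var n = Int32.max
    // n += 1                        // would trap at runtime instead of wrapping
    n &+= 1                          // explicit wrapping: n is now Int32.min
    print(wordSized, exact, n)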


I'm thinking we need a permissions system for programming languages like we have on smartphones, hear me out. People can't trust every library out there and all of their dependencies. So what I want to do is restrict every library by default, and refine permissions on an as-needed basis. For example, why would a handy library that I grabbed from a random github repo have access to my entire hard drive? Or why would it need to access the entire internet? Or why would it need to access memory that is owned by another library?

For me, this would be an essential part of being a "safe" language. And I bet this is the kind of safety we need more than just memory safety.


Check out Austral.


Cool, thanks!


Anyone got an opinion on Embedded C++, a C++ subset/dialect?

https://en.wikipedia.org/wiki/Embedded_C%2B%2B


Ah yes, the language that eschews namespaces and static_cast/reinterpret_cast, those major sources of issues on embedded platforms.


Still better than plain old C, and it was used as inspiration for the C++ subset allowed in IOKit.


Really interesting concepts here, especially subscripts (which neatly solve an issue that RAII semantics or lambdas are usually hacked into solving).

Is my reading of overloads correct? Identically-named methods are supported, but only for different argument labels and/or calling conventions?

Rust convinced me that dispatching based on argument types is a poor idea, because Vec::new(10) could only be understood by reading the documentation at least once - whereas Vec::with_capacity(10) is abundantly clear. Val seems to gain these benefits (assuming that it doesn't dispatch based on type) without the drawbacks.


You're correct; since argument labels are really part of the function name, you can say all function names must be unique.
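
Swift works the same way, which makes for a reasonable sketch of how that cashes out at the call site (the filledWith label is made up for illustration):

    // The label is part of the name, so init(repeating:count:) and
    // init(filledWith:count:) are simply two different functions; nothing is
    // resolved by inspecting the argument's type.
    extension Array where Element == Int {
        init(filledWith value: Int, count: Int) {
            self.init(repeating: value, count: count)
        }
    }

    let a = [Int](repeating: 0, count: 10)
    let b = [Int](filledWith: 0, count: 10)   // the label documents intent at the call site
    print(a == b)                             // true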


What about generic code?


Can you really have a systems programming language where the runtime cost is not apparent from the code and stable?

Having some code suddenly gain a gigantic copy in an inner loop because of a far-away change seems like a risk in Val.


It seems to me the lack of references would prevent any use in contexts that are talking directly to the hardware (which is typically memory mapped). I’m thinking of microcontrollers or device drivers. You could presumably patch it over with unsafe, but in those contexts that would be the majority of the code, which defeats the purpose imo. I guess they are targeting a different market, but that seems like a big downside for a systems programming language.


Do these terms (“subscript”, “projection”) have a deeper meaning - maybe from another discipline - or are these terms arbitrarily chosen to explain the abstractions of the language?


“Projection” is often used in functional programming to mean a function that picks a value out of a tuple, e.g. `fst (x,y)` returns x (the first element of the tuple) which is like projecting a vector onto an axis.


Projection is also a similar (but more general) operation in relational algebra (i.e. SELECT in SQL).


Very straightforwardly: the two terms come from mathematics.

"Subscripts are often used to refer to members of a mathematical sequence or set or elements of a vector. For example, in the sequence O = (45, −2, 800), O₃ refers to the third member of sequence O, which is 800." (Wikipedia)

As fanf2 said, projection in software typically refers to selecting one element of a tuple.

"[Projection is] An operation typified by the jth projection map, written projⱼ, that takes an element x = (x₁, ..., x ⱼ , ..., xₙ) of the Cartesian product X₁ × ⋯ × Xⱼ × ⋯ × Xₙ to the value projⱼ(x) = xⱼ" https://en.wikipedia.org/wiki/Projection_(mathematics)


They are meant to be evocative, but it is best to consider them as arbitrary tokens. There is no close analog in other languages.


Basic question, but if there are no references and it’s all values all day long, then how do strings, arrays, and other types whose size is unknown at compile time work?

I saw the example they gave, which is cool, but it used string literals. So, per my question above, how would the same program work if the strings came from user input?


In my understanding, there are still pointers under the hood, they're just not exposed as such in the type system. If you can't copy a pointer you can't alias, so they're indistinguishable from the value itself in practice.
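
A Swift sketch of the same arrangement, since Swift's String already works this way and is a reasonable stand-in:

    // The characters live in a heap buffer, but that buffer is never exposed,
    // so the string acts like a value even though its size comes from runtime input.
    let line = readLine() ?? "fallback"   // length unknown until the user types something
    var copy = line                       // logically an independent value
    copy.append("!")                      // copy-on-write: only now is the buffer duplicated
    print(line)                           // original, unchanged
    print(copy)                           // original plus "!"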


This part is extremely confusing: https://tour.val-lang.dev/bindings#lifetime Why does a value declared with `let` have a lifetime? Is it because they want to avoid copying objects and data structures?


To avoid shared mutable state, you either need copy-on-write (hence shared but not mutable) or mutable but not shared.

I understand this is the second approach.
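
A Swift sketch shows the two strategies side by side, since Swift happens to use both (the array size is arbitrary):

    // Strategy 1, copy-on-write: copies are cheap because storage is shared
    // until one side mutates.
    var a = Array(repeating: 0, count: 1_000_000)
    var b = a                  // no copy yet; both bindings share one buffer
    b[0] = 1                   // the buffer is duplicated here; `a` is untouched

    // Strategy 2, mutable but not shared: grant temporary exclusive access
    // instead of copying. No other access to `a` is allowed for the duration
    // of the call, so no aliasing can be observed.
    func increment(_ xs: inout [Int]) { xs[0] += 1 }
    increment(&a)

    print(a[0], b[0])          // 1 1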




Or the other Vale! https://vale.sh/


Nor with Valgrind https://valgrind.org/


Yup, this one has me seriously confused. They even both seem sort of similar in scope...


I often contemplate whether programming language designers consider the developer experience during the design process. Merely perusing the sample code on their web page causes discomfort in my eyes.


> mutable value semantics

Isn't this the same effect as we get when using Copy-on-Write, e.g. with the implicitly shared container classes Qt offers?


> Val is compiled ahead-of-time to machine code

... to which architecture(s)?


Val is using LLVM.


How things can go down on HN strikes me as odd, and this seems to be an example. "People" suddenly seem to be quite supportive of a project that: 1) has not released an alpha, even after a year; 2) has a website that promises a lot (like 2-way C++ interoperability); 3) clearly tells people it will have trouble supporting the Windows OS; 4) has relatively few stars and contributors on GitHub; 5) offers wishlists and promises, which are suddenly fine by them.

Then for other languages it's media blackout, hate, and an avalanche of trolling. I suppose the hidden machinery and utility accounts don't come out unless "they" perceive an actual threat to its position.

Don't get me wrong, love new languages and seeing more competition. People should follow their dreams. Just weirds me out, as to how reactions and perception goes.


People were rightfully critical of V for overpromising and underdelivering, as well as misrepresenting the capabilities of the language.

Val has promised a lot and hasn't delivered anything, so I think it's fine to be interested/ excited but have a heavy dose of scepticism as well.


It seems V is now delivering quite well.


Not only did detractors flag away my previous response that agreed with your point about how V delivers, but based on their MO (preoccupation with 2019, same catch phrases, and constantly spamming the same certain sites), it's likely the same group and their dedicated accounts for such purposes. They are trying very hard to push a negative narrative and eliminate the public viewing of any counter or supportive arguments.


[flagged]


[flagged]


> You're a known troll spamming

Can you accept even one criticism? Everyone who dares to say anything negative is a troll in your book.


You were told this by the mods of the website.

By the way, the array hack was fixed, thanks for reporting.

If you find similar bugs, don't hesitate to report them.


[flagged]


[flagged]


> the creator of V had a clear and obvious pattern of overpromising and overhyping things early on

can you please list any examples here?


[flagged]


The Vlang creator's initial misrepresentation of the language, and their very problematic and hostile handling of any questions regarding the design of their language, mean that, for me, I will never use it and don't consider it a serious language.


Can you please list here examples of under delivering and misrepresenting?


Please don't take HN threads further into flamewar on this topic. It's extremely tedious, it's happened dozens of times already, and we don't need any more of this.

I'm sure your programming language is just fine. What's far from fine are these flamewars. Everybody, on all sides of this, needs to stop doing it on HN. Other forums are fine with flamewars; please conduct them there instead. It's not what this site is for, and destroys what it is for.

https://news.ycombinator.com/newsguidelines.html


As you're well aware, there were a series of articles written about this[0][1] which were popular on HN a few years back. This is well documented, reproducible, and done in good faith.

Now, I just want to make it clear that my exact wording here was "people were rightfully critical of V" and not "people are rightfully critical of V" (emphasis added). My intent is not to malign the current state of V, because 1) I do not keep up with the language and 2) it isn't relevant to the point being made.

The parent comment was drawing a comparison between "other languages" (which, by now, it's apparent that they're speaking primarily about V) and Val when it comes to HN's reaction. If we're going to compare, it only makes sense to compare them at similar stages of development / release. So, in this case we're talking about early release V and what was promised versus delivered. It does not matter what is being done in service of users now because the criticism I'm referring to isn't from now, but when the initial HN impressions of V were formed.

[0]: https://xeiaso.net/blog/v-vaporware-2019-06-23

[1]: https://xeiaso.net/blog/vlang-update-2020-06-17


These articles have stuff like "V depends on git and electricity" and benchmarking debug builds of V with a slow backend. The author also said that V must die.

Can you please list here actual examples of under delivering and misrepresenting?


[flagged]


[flagged]


[flagged]


[flagged]


This is not the main site (vlang.io). Such a claim was never made there. The OP mentioned that it "was on the list of 'features' for a long time".

The HN comment you referenced is from 2019. I was very anti-GC back then; that's why I didn't want to have a GC, but rather a model similar to Rust's. After playing with GC, I realized it worked really well with V and offered better performance.

There's an article about these decisions on V's blog. They can change over years, you know.

If a project leader changes their mind on a technical decision over years and explains the reasons, that doesn't make them a liar.

The end goal is having good performance, flexible memory management, minimal resource usage.


[flagged]


[flagged]


You have a history of posting flamewar comments to HN about this topic. If you do it again, we will ban you.

I'm not going to ban you right now, because you haven't been posting exclusively about it - but these flamewars are unacceptable. No more of this please.

https://news.ycombinator.com/newsguidelines.html


> Then for other languages its media blackout, hate, and an avalanche of trolling. I suppose the hidden machinery and utility accounts don't come out, unless "they" perceive an actual threat to its position.

? I dunno where you’re getting this from.

Pony? Zig? Crystal? Carp? Taichi? Dex? Vale?

The reception has been generally positive imo? Vale got some complaints for being very rough, but the reception was pretty much ok?

I mean, those are just things I’ve seen flow past on HN, so maybe others drop off and never see the light of day, but that’s true of most links on HN in general.

To rise up and then get heaps of negativity seems unusual for a programming language?

You sound quite “conspiracy theory” with this, and I think you’re sitting in a weird echo chamber.

If anything, I’d say the big difference is that this is broadly speaking a technical audience; a language that is not pitched at a technical audience (or exaggerates for marketing purposes like getting funding) will just get downvoted as spam.

Val has an open source compiler you can look at right now (not vapour), clearly points out it’s not ready for use, but has big ideas.

Doesn’t seem like you need a “deep state” to be enthusiastic about competent people building a new language, and having a chance to push it in a direction you personally want as it evolves.


> You sound quite “conspiracy theory” with this...

It's not based on conspiracy, or on throwing out a quick, unresearched opinion just to smite someone. It's based on searching and reading posts made on HN over the years.

> (exaggerates for marketing purposes like getting funding) will just get downvoted....

> (not vapour), clearly points out it’s not ready for use, but has big ideas...

The difference is being consistent when handing out such punishments (downvoting) and negative labels (vapor). Treat all the languages the same, or apply those rules equally; otherwise the hypocrisy shows itself. Don't be so angry or surprised that someone might comment on it or point it out.

And when using terms like "vapor", let it be one's original thought, and not what is supplied by others painting a certain narrative.


> The difference is, being consistent with handing out such punishments (downvoting) and giving negative labels (vapor). Treat all the languages the same or equally apply those rules

???

Isn’t that literally what I’m doing with my examples of like 6 different programming languages?

> Don't be so angry or surprised

I apologise if I somehow came across as angry; I’m simply stating a fact.

There is no conspiracy.


Communities aren't monoliths.

The group of people who criticize real, tangible languages and the people who have the skills to critique abstract language design don't necessarily overlap.


> Then for other languages it's media blackout, hate, and an avalanche of trolling [...] > Just weirds me out, as to how reactions and perception goes.

The prog lang world is a massive circle jerk, with plenty of toxicity, extreme fanboy-ism and haters.


Hard to argue against the truth or the reality of it.


Being supported by Adobe, and having Dave Abrahams and Sean Parent involved, kind of gives it bonus points.

This is not a random language.


This seems like a kneejerk reaction to the topmost comment not being critical. There's plenty of critique and skepticism if you read past the top, be it the name, parts of the docs, whether their mutable value semantics work, etc.


It seems to be a coin flip. Whichever comment gets upvoted early on continues to be upvoted and becomes the consensus.


I think the interest in this project is also piqued by the fact that some of the people behind it are quite well-known and respected members of the C++ (and Swift) communities.


[flagged]


Please don't take HN threads into flamewar. We've already asked you this once and we have to ban accounts that keep doing it.

We detached this subthread from https://news.ycombinator.com/item?id=36779614.


my comment in no way takes the thread into a flame war, and i'm pretty frustrated that it's being judged that way


It's a classic programming language flamewar comment. This is not a borderline call!

The only other people who will care about such a post are the people who feel as pro-$LANG and/or anti-$LANG as you do - and then the two tribes will bash each other with clubs. This dynamic has been known at least since Usenet and it's very much what we don't want on HN, so please don't do it here.

https://news.ycombinator.com/newsguidelines.html


Why?

e.g. Here's DOOM running in V (translated from C). Seems pretty serious:

https://www.youtube.com/watch?v=6oXrz3oRoEg

Also, can you look at this recent 0.4 changelog with 370 items and say the language is not serious?

https://vlang.veery.cc/post/725/v-0.4-is-out


[flagged]


Nonsensical, including telling someone else which languages starting with the letter "V" can't be on their list.

As for reading the issues and pull request sections, just about all languages have them, so everybody must be "blind", especially with younger languages. And comparing by age, other languages have more issues and worse close ratios.


the point isn't that issues and PRs exist, it's that the maintainers don't seem to know what they're doing

this is self-evident from a cursory review

link me any 3 non-trivial merged PRs and i'm happy to point out the problem i'm describing


can you give any examples?


Sounds great, but why another language?


Because it works differently than existing languages?


May you plz ELI5 'High-level'?

I'm just pointing out that there are so many idioms, nomenclatures, bits of slang, blah blah, that simply saying "high level" comes across as condescending, as if to say "I don't need to explain myself to you!!"

But yet, you DO have to explain yourself


Good:

- It uses "0o" instead of "0" prefix for octal numbers.

- Underscores to separate groups of digits in numeric literals.

Not good:

- It uses Unicode.

- It does not have a "goto" command.

- There are no macros.

- Perhaps the name is too similar to that of another programming language, maybe?

This isn't everything that can be commented about it; I have not examined it completely yet.


I've never liked macros... What attracts you to that feature, I'm curious?


in most compiled languages you need something like them to get fully featured metaprogramming. It's not necessarily a case of liking them so much as their often being the best option. (Note that C macros are a terrible implementation of macros, and yet, despite that and the C++ committee hating them with a passion, they are still often the only way to do certain things in C++, because the committee has not sought to make better macros but has instead tried to extend other language features to cover some of their use cases.)


metaprogramming is not a virtue

programming languages do not themselves need to be programmable


My experience suggests otherwise. If the appropriate facilities don't exist in a language, users will resort to code generation.


usually, code generation is preferable to metaprogramming, mostly because it is easier to understand and maintain


Code generation is just external macros; it's the same thing in a worse form.

To maintain the code, you have to understand the input language to the code generator and its metaprogramming constructs. You're no better off in that regard.

The grandparent comment is saying that if you don't give people metaprogramming built-in, they will resort to outboard metaprogramming.

I.e. you can't stop people from metaprogramming.


code generators are programs written in an existing programming language, which produce target language source code as output

macros are programs written in a separate, unique, often turing-complete meta-language, which is implemented entirely in the compile phase of the language which supports them

they're in no way equivalent


That is why stuff like m4 gets born.


If done right, a macro system allows you to make your language modular and experiment with new language features without having to change the core language and the compiler. With the macro approach, languages become libraries.

The Racket people took this concept very far. The kernel of the language is very small and well defined. All Racket programs (or more precisely, expressions) are eventually reduced to a handful of syntactic core forms (see [1]). For example, thanks to forms such as (#%variable-reference id) you can specify rules for variable access, for example, w.r.t. life-time. With tools such as the Macro Stepper you can fully step through the transformations of any expression in your program, from the highest to the lowest level.

This has numerous benefits. Extensions or modifications of the language can be rolled out (and used!) as libraries. This makes collaboration and testing far easier. Also, if a language feature turns out to be a bad idea, you deprecate the library. You do not have to change the compiler. This allows you to shrink your language and explore different directions without the burden of an ever growing language spec and implementation.

Is it a perfect solution? No. Changing a widely used language always has big impact, but the impact can be compartmentalized and users of the language are given a graceful migration path by updating their libraries, at their choice and pace.

Is Racket perfect? No, not by a long shot. But, frankly, language authors should at least take a look at the possibilities and consider the technological options for controlling the evolution of a language.

[1]: https://docs.racket-lang.org/reference/syntax-model.html#%28...


Macros are very helpful for making many things. There are some problems with how macros are working in C, but it is also beneficial sometimes.

One thing that C does not have is something like METAFONT's "vardef"; if you had "scoped macros" that could be declared as global or local variables and as members of structures, it would cause less interference than the current system and could even sometimes provide other benefits too.

Others include e.g. possibility for macros to define other macros (probably several other programming languages have such a thing), for appending to existing definitions (like {append} in Free Hero Mesh), for altering existing macros with a value calculated immediately instead of each time the macro is called (like {edit} in Free Hero Mesh), etc.


I guess you have never programmed in Lisp?


cf https://people.csail.mit.edu/gregs/ll1-discuss-archive-html/...

(this is PG's take, but IIRC the thread as a whole had some interesting ideas. Unfortunately Guy Steele's replies seem to have been under a different Subject: and haven't been threaded by the archive)

Edit: HN to the rescue: https://news.ycombinator.com/item?id=9695323


Not good: It uses Unicode??

What is not good about being able to use unicode? For me not being able to process unicode is an instant fail.


Not including Unicode would not make it unable to process Unicode, since you can implement whatever character encodings you want to do, without being stopped by the programming language's bad idea of character sets.

However, processing Unicode is often unnecessary anyways. Sometimes you only need to deal with ASCII (and can pass through non-ASCII unaffected). Sometimes the Unicode handling can lead to bugs (and sometimes can be security problems; and this is not usually due to Unicode being implemented incorrectly). Unicode can also make the code less efficient especially when Unicode is unnecessary but sometimes even if you do deal with Unicode (due to internal conversions and counting and other stuff it is doing when it is not helpful or might even be against what you are trying to do). Inherent Unicode handling can also sometimes make it difficult to handle byte strings especially if they do not have many of the same operations available to them.

It also tends to lead to bad design (API design and programming language core design). Sometimes it is used even though byte strings would be more appropriate, or sometimes you might want a separate "file path" string type (I think Common Lisp does this). Treating source files as Unicode text can also be problematic.

Unicode is not a very good character set anyways; it is bad in many ways. I could say what ways it can be bad for many different languages and for other purposes such as security and efficiency and accessibility, too. (Some people say it is better than dealing with other character encodings for multilingual text. I have worked with it and found the opposite to be true.)



