Why your first Rust FizzBuzz implementation may not work (chrismorgan.info)
183 points by chrismorgan on Oct 12, 2014 | 129 comments



Putting the String issue aside, I just wanted to show the beauty of pattern matching.

    for i in range(1i, 101) {
        match (i % 3, i % 5) {
            (0, 0) => println!("Fizzbuzz"),
            (0, _) => println!("Fizz"),
            (_, 0) => println!("Buzz"),
            _ => println!("{}", i),
        }
    }
-- edited: removed `.to_string()`, thanks chrismorgan


I haven't looked too deeply into Rust yet, but was able to understand this coming from Elixir. Pattern matching makes for a beautiful solution. This is a similar solution in Elixir:

  fizzbuzz = fn(x) ->
    case {rem(x, 3) == 0, rem(x, 5) == 0} do
      {true, false} -> IO.puts "fizz"
      {false, true} -> IO.puts "buzz"
      {true, true}  -> IO.puts "fizzbuzz"
      _             -> IO.puts x
    end
  end

  Enum.each Range.new(1, 100), fizzbuzz
Since functions are also pattern matched in Elixir (and Erlang!) it could also be done without using case and handled purely as functions.


Yeah, it's also fun that you can do it in a multiple function head pattern matchy way

  defmodule FizzBuzz do
    def fizzbuzz(x),          do: fizzbuzz(x, {rem(x, 3), rem(x, 5)})
    def fizzbuzz(_x, {0, 0}), do: IO.puts "fizzbuzz"
    def fizzbuzz(_x, {0, _}), do: IO.puts "fizz"
    def fizzbuzz(_x, {_, 0}), do: IO.puts "buzz"
    def fizzbuzz(x,  {_, _}), do: IO.puts x
  end

  Enum.each Range.new(1, 100), &FizzBuzz.fizzbuzz/1

or in a more ruby-esque fashion

  (1..100) |> Enum.each fn(x) ->
    cond do
      rem(x, 3) == 0 and rem(x, 5) == 0 ->
        IO.puts "fizzbuzz"
      rem(x,5) == 0 ->
        IO.puts "buzz"
      rem(x,3) == 0 ->
        IO.puts "fizz"
      true ->
        IO.puts x
    end
  end


Python doesn't have pattern matching but the code is basically the same. I guess better cases for showing off the feature are ones where the patterns aren't just True/False tuples.

    for i in range(1, 101):
        fbsign = (i % 3 == 0, i % 5 == 0)
        if fbsign == (1, 1): print("Fizzbuzz")
        elif fbsign == (1, 0): print("Fizz")
        elif fbsign == (0, 1): print("Buzz")
        else: print(i)


One difference is that this won't do exhaustiveness checks, where the pattern match will.


Well, here's an exhaustively checking version. Still not statically checked, of course.

    for x in range(1, 101):
        print([
                            # %3 == 0
            [      x,       "Fizz"    ],
            [   "Buzz",   "FizzBuzz"  ],  # %5 == 0
        ][x % 5 == 0][x % 3 == 0])
(I'll note this was harder to get right than if-checks and pattern matching, so I won't be replacing such logic with matrices in my Python programs any time soon...)


> not statically of course

Right, but that's the point!


Some people are saying this is extremely non-idiomatic Python. I think most of the problem is not following the style guides. Here's a PEP 8 compliant solution that is a bit more idiomatic, and almost as compact.

In Python, the way to do pattern matching is with dictionaries of functions.

    fizz_buzz = {(True, True): lambda x: "Fizzbuzz",
                 (True, False): lambda x: "Fizz",
                 (False, True): lambda x: "Buzz",
                 (False, False): lambda x: x}

    for i in range(1, 30):
        fbsign = (i % 3 == 0, i % 5 == 0)
        print(fizz_buzz[fbsign](i))
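
(For kicks, here's roughly the same dispatch-table shape in Rust: a sketch of my own, using a HashMap of fn pointers keyed by the two booleans, not anything from the article.)

    use std::collections::HashMap;

    fn main() {
        // Dispatch table keyed by (divisible by 3, divisible by 5),
        // mirroring the Python dictionary of functions above.
        let mut table: HashMap<(bool, bool), fn(u32) -> String> = HashMap::new();
        table.insert((true, true), |_| String::from("Fizzbuzz"));
        table.insert((true, false), |_| String::from("Fizz"));
        table.insert((false, true), |_| String::from("Buzz"));
        table.insert((false, false), |x| x.to_string());

        for i in 1..31 {
            let key = (i % 3 == 0, i % 5 == 0);
            println!("{}", table[&key](i));
        }
    }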


I see it as overthinking simple stuff. Maybe I think about performance too much, but constructing a dictionary and then defining functions to do something this simple is, in my opinion, over-designing things. Too much abstraction for expressing a simple concept. Performance aside, you have to go find the dictionary once you see code like that, while with a chain of if-else statements it's right there in front of your eyes.

As to the 0 and 1 thing... Python has C-ish booleans (basically an integer type). Using True and False as numbers is completely standard, as PEP 285 indicates. I tend to agree that using True/False is a bit more natural here, but it really is a type-purity nitpick in my view.

Could you point out which part of PEP 8 the code in my original post violates? While I don't really like a lot of the currently popular style, I see a lot of value in following the PEPs and long-established conventions.


That's not genuinely pattern matching, however. It only works for patterns without member binding.


That is very non-idiomatic python. Use True and False instead of 1 and 0 when comparing the (boolean) result of a comparison.


I contemplated using `match` in the article, but decided that there was enough to think about and that it was long enough already. Thanks for pointing it out, though—it certainly is a great thing!


Weird to see this mix of a very imperative for-range iterative loop with a very functional pattern match, which makes it look similar to an SML or OCaml solution to FizzBuzz. I guess this is the definition of multi-paradigm right here.

Will Rust's type checker warn you of a non-exhaustive pattern match?


Rust was first conceived by an avid OCamler, and it was originally implemented in OCaml too. Although the pot has been stirred quite a bit since those early days, the influence still remains, including the expression-heavy programming style, pattern matching, `let`s, HM inference, and the `var: T` declaration syntax. Whilst Rust is quite procedural (you rarely use recursion), it often feels quite functional because of those things.


Small note, technically Rust never did HM, and now we _certainly_ don't. It's still inference, just not that algorithm.


Oh - I thought it was a variant on HM, extended for region inference?


I'm not _totally_ sure, but I do know when I tried to reference HM in the docs I got yelled at... and now I'm pretty sure we do http://smallcultfollowing.com/babysteps/blog/2014/07/09/an-e...


It wasn't strictly HM, as it had extensions for the subtyping that lifetimes require. It was based on HM, however.

The new bespoke scheme gives approximately the same results as HM but is drastically simpler. For all I know this could inhibit Rust's future ability to do even more powerful things with types, but AIUI this scheme has the advantage of being actually decidable given the extensions to HM that we would require in the current language.

I ain't a type theorist though, so take this as hearsay. :)


Strictly speaking, I think almost no extant languages, and certainly no mainstream ones, use pure HM, but many take it as a starting point. Certainly, HM has no notion of ML modules, or type classes, or record types, or lifetimes. Nevertheless many languages using those things use HM as a starting point.

I'm curious (and a bit skeptical) of your claim that the scheme is "drastically simpler" than HM. HM is a beautifully simple design, which can be expressed (abstractly) in just a couple of lines.


I'm paraphrasing Niko Matsakis, Rust's type guru.

http://smallcultfollowing.com/babysteps/blog/2014/07/09/an-e...

"This scheme simplifies the code of the type inferencer dramatically and (I think) helps to meet our intutions (as I will explain). It is however somewhat less flexible than the existing inference scheme, though all of rustc and all the libraries compile without any changes."


Here's the same approach in F#; without the different types of String it's easier to go more functional.

    let fizzbuzz num =     
       match num % 3, num % 5 with      
          | 0,0 -> "FizzBuzz"
          | 0,_ -> "Fizz"
          | _,0 -> "Buzz"
          | _,_ -> num.ToString()

    [1..100]
      |> List.map fizzbuzz
      |> List.iter (fun (s:string) -> printfn "%s" s)


And racket, just for kicks

    #lang racket
    
    (define (fizz-buzz n)
      (match (list (modulo n 3) (modulo n 5))  
          [(list 0 0) "FizzBuzz"]
          [(list 0 _) "Fizz"]
          [(list _ 0) "Buzz"]
          [_          n]))
    
    (for [(i (range 1 101))]
      (displayln (fizz-buzz i)))


Haskell -

  fizzbuzz x = case (x `mod` 3, x `mod` 5) of
      (0, 0) -> "FizzBuzz"
      (0, _) -> "Fizz"
      (_, 0) -> "Buzz"
      _      -> show x

  mapM_ (putStrLn . fizzbuzz) [1..100]


Pattern matching is one of the ways to get "two-mod" code where the modulus operator is used two times. For example, string concatenation and assignment operators do the same in this Python code:

    for i in range(1, 101):
        x = ""
        if i % 3 == 0:
            x += "Fizz"
        if i % 5 == 0:
            x += "Buzz"
        if x == "":
            x += str(i)
        print(x)
If anybody is randomly curious, it can be fun to solve FizzBuzz in Haskell the same way that this Python code does it, but (to be more idiomatically Haskell) storing the FizzBuzz success in a Maybe (or some other container). If you define an appropriate `~>`, a lower-precedence `|~`, and a still lower-precedence `|~~>`, you can write the above function as:

    fizzbuzz x = 
        mod x 3 == 0    ~> "Fizz" 
        |~ mod x 5 == 0 ~> "Buzz" 
        |~~> show x 
It's interesting because it's sort of a "follow-through guards" situation; the (|~) operator can at least be turned into a type signature of (Monoid m) => Maybe m -> Maybe m -> Maybe m.
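
(A rough Rust analogue of what those operators are doing, as I read the description; the names and shape here are my own sketch, not the article's code. The two Options play the role of the Maybes, the match is the monoidal `|~`-style combine, and `unwrap_or_else` is the `|~~>`-style fallback.)

    fn fizzbuzz(i: u32) -> String {
        // Record each "hit" in an Option, then combine the two monoidally.
        let fizz = if i % 3 == 0 { Some(String::from("Fizz")) } else { None };
        let buzz = if i % 5 == 0 { Some(String::from("Buzz")) } else { None };
        let combined = match (fizz, buzz) {
            (Some(a), Some(b)) => Some(a + &b),
            (Some(a), None)    => Some(a),
            (None, b)          => b,
        };
        // Fall through to the number only when nothing matched.
        combined.unwrap_or_else(|| i.to_string())
    }

    fn main() {
        for i in 1..101 {
            println!("{}", fizzbuzz(i));
        }
    }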


Currying often allows for elegant point free code. Like in your last line, for F# they could also have written

    [1..100] |> List.iter (fizzbuzz >> printfn "%s")


Nice F# snippet! This just looks so simple and elegant. As another approach, the "enum" types the OP mentions map to discriminated unions in F#.

FWIW your final wildcard match can be (modestly) simplified to just

  | _ -> num.ToString()
and the last line can be distilled to just

  |> List.iter (printfn "%s")


Although I love pattern matching, I find the solution with 'if' more legible and no less functional.


Non-exhaustive pattern matching is a compilation error.


> Will Rust's type checker warn you of a non-exhaustive pattern match?

It refuses to compile entirely.
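
(A minimal sketch of what that buys you, in present-day Rust syntax: matching on the two booleans, rustc insists that all four combinations are covered; delete any one of these arms and the whole program fails to compile with a non-exhaustive-patterns error.)

    fn main() {
        for i in 1..101 {
            match (i % 3 == 0, i % 5 == 0) {
                (true, true)   => println!("FizzBuzz"),
                (true, false)  => println!("Fizz"),
                (false, true)  => println!("Buzz"),
                (false, false) => println!("{}", i),
            }
        }
    }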


I guess beauty is in the eye of the programmer. I'd choose Python's or Ruby's FizzBuzz. It's beautiful that everyone can immediately understand those. This one, not so much. As a little experiment, I've deliberately avoided learning Rust to see if I can understand its idioms without reading any docs. I can sort of guess at what's going on here by reverse engineering what should happen with FizzBuzz, but it's not at all intuitive. For example, as an outsider, I'd expect it to be (0, 1) instead of (0, 0) since it's matching both the 0th and 1st patterns. Whereas (0, _) would be "0th pattern but not the 1st," or something, even though that really wouldn't make much sense because "0" would refer to which pattern it's matching, rather than the position of the argument determining which pattern it's matching. Etc.

If Rust is the most robust way to solve a problem, it should naturally catch on. It seems pretty promising in that regard.

EDIT: As a counter to my comment, my argument would be equally applicable to Lisp, and Lisp is beautiful. So my argument is probably mistaken.

Maybe someone has to learn a language before judging whether it's beautiful.


You can write Ruby in this fashion as well.

  def fizzbuzz x
    case [x % 3 == 0, x % 5 == 0]
      when [true, false] then puts "fizz"
      when [false, true] then puts "buzz"
      when [true, true] then puts "fizzbuzz"
      else puts x
    end
  end
The one and only time I was asked FizzBuzz in an interview, I wrote it this way, so it's not all that contrived (in my opinion, anyway!)


It must just be a matter of familiarity. Rust's pattern matching is quite similar to other languages with pattern matching.

For example in OCaml the match statement would be:

     match ( i mod 3, i mod 5) with 
      | (0,0) -> Printf.printf "Fizzbuzz"
      | (0,_) -> Printf.printf "Fizz"
      | (_,0) -> Printf.printf "Buzz"
      | (_,_) -> Printf.printf "%d" i
Other than the irrelevant syntax bits like 'with', the semantics are identical: in-order matching with no fall-through, and _ for unnamed and unused bindings.

To me with very limited experience in it, Rust really feels like OCaml with a skin that C programmers will understand.


> As a little experiment, I've deliberately avoided learning Rust to see if I can understand its idioms without reading any docs.

:D

If you have the time, as you do this, I'd love to hear about your experience. Email me any time.

(I maintain Rust's docs, and am also starting to write some introductory curriculum. Hearing from people like you is _invaluable_.)


Oh, okay! Cool! I didn't realize it'd be valuable to anyone. I'll try to put together something for you, and I'll take it seriously so that it isn't biased one way or the other. I have some stuff coming up, but after seeing some incredibly neat stuff written in Rust, I'm planning on doing a project myself, and I'll email you with a raw braindump of my first experiences with the language, along with a list of previous languages I've learned as well as my experience level with each. Thank you for maintaining Rust's docs!


Great, thanks! (And Mozilla is owed some thanks for paying me; otherwise I wouldn't have nearly as much time to do it.)


It’s whether (i % 3, i % 5) is equal to (0, 0) et al., where _ means “any value”.


To be fair, you can bind to any name. So, binding to `a` instead of `_` would work as well. You could then use that bound value in the corresponding expression.

However, I would have expected rustc to complain about unused variables in

    fn main() {
        for i in range(1i, 101) {
            match (i % 3, i % 5) {
                (0, 0) => println!("Fizzbuzz"),
                (0, a) => println!("Fizz"),
                (b, 0) => println!("Buzz"),
                c => println!("{}", i),
            }
        }
    }
but neither the playpen nor yesterday's snapshot complains. And if you want to suppress warnings about unused variables, you prefix the variable with an underscore, or use just the underscore, which has become the common way to say "I don't care what value gets bound to this name".

    fn main() { let a = 0u; }
compiles with warning: unused variable: `a`, but

    fn main() { let _a = 0u; }
compiles silently.


Thanks for noticing that! I filed https://github.com/rust-lang/rust/issues/17999


That's a very useful feature. Maybe I'll go ahead and learn Rust now. If it has features like pattern matching, which seems about ten times more useful than the classic switch statement, then it probably has a lot of other insights worth learning.

If you were to start a hypothetical project written in Rust, what would it be? I'm looking for something to cut my teeth on.


I would suggest you port over a project that you are already familiar with. It's easier to learn a new syntax when you don't have to grapple with implementation as well. And you get to have an objective comparison of the same project implemented 2 different ways.


I agree with this, but I will say that sometimes, you end up structuring a program differently due to the language. This happens a lot in Rust.

It's still easier when you've solved the problem previously, however.


> As a little experiment, I've deliberately avoided learning Rust to see if I can understand its idioms without reading any docs.

A result of applying this approach to languages in general would be only knowing a couple of fairly similar languages. Which is bad. Really, really bad. Language influences our way of thinking in a non-trivial way; knowing only one kind of language is limiting.

Even worse, you're going to stay constrained forever to one family of languages that you (probably) didn't even choose yourself. In the current world it's OK if you were introduced to a C-like language as your first, but what if it was Pascal or Scheme?

In short: DON'T DO THIS. Learn more languages, the more FOREIGN (i.e. you can't understand anything without docs) the BETTER. The 'intuitive' languages only let you express the same solution again and again, while breaking AWAY from your intuitions and learning 'non-intuitive' languages lets you see and implement DIFFERENT solutions.


It may be that _ for "any" is less expected than a star or a dot would be. We've been used to the star in shells and the dot in regexps for decades already.

(0, *) seems more obvious in that example than (0, _), but at least we're used to ____ in forms printed on paper too.


Just as a point of interest: in these cases, (0, ..) would also work, where .. means “any number of elements”.


`..` doesn't work with tuples (yet?).
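
(For what it's worth, rest patterns in tuples do compile on current rustc; this is a sketch against today's compiler, not the one being discussed here.)

    fn main() {
        for i in 1..16 {
            match (i % 3, i % 5) {
                (0, 0)  => println!("FizzBuzz"),
                (0, ..) => println!("Fizz"),   // `..` covers "the rest, whatever it is"
                (.., 0) => println!("Buzz"),
                _       => println!("{}", i),
            }
        }
    }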


> As a little experiment, I've deliberately avoided learning Rust to see if I can understand its idioms without reading any docs.

As an experiment of what? Whether Rust code makes some sense to you depends to an extent on what languages you already know (I guess ML languages would help). Same as with Ruby and Python.


I fail to see how a simple switch statement doesn't read just as easily. I mean, sure I have to know the mod 15 trick, but... not exactly hard.

    function fizzBuzz(i) { 
      switch(i % 15) { 
        case 0: return "fizbuzz";
        case 5: 
        case 10: return "buzz";
        case 3:
        case 6:
        case 9:
        case 12: return "fizz";
        default: return i.toString();
      }
    }
    [1,2,3,4,5,6,7,8,9,10,11,12,13,14,15].map(fizzBuzz)


Even that is better in Rust (using the enum from the end):

    match i % 15 {
        0 => FizzBuzz,
        5 | 10 => Buzz,
        3 | 6 | 9 | 12 => Fizz,
        _ => Number(i),
    }
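
(To make that snippet compile on its own, here's one guess at the surrounding enum; the variant names follow the snippet, but the exact shape of the article's version may differ.)

    enum FizzBuzzItem {
        FizzBuzz,
        Fizz,
        Buzz,
        Number(i32),
    }

    use FizzBuzzItem::*;

    fn classify(i: i32) -> FizzBuzzItem {
        match i % 15 {
            0 => FizzBuzz,
            5 | 10 => Buzz,
            3 | 6 | 9 | 12 => Fizz,
            _ => Number(i),
        }
    }

    fn main() {
        for i in 1..101 {
            match classify(i) {
                FizzBuzz  => println!("FizzBuzz"),
                Fizz      => println!("Fizz"),
                Buzz      => println!("Buzz"),
                Number(n) => println!("{}", n),
            }
        }
    }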


I actually agree it is better. But... my point was that the switch statement was already pretty readable. If there are gains, they feel pretty small in these examples.

And to be clear, I like pattern matching. A lot. I just don't feel this really shows it off that well.


That's only because you're expecting the cases to fall through; you're being blinded by your own expectation.

Your code is considered bad practice in many languages.


What do you expect; it's FizzBuzz.


For what it's worth, String in Rust is similar to StringBuffer in other languages. You can append to a String; you can't append to a slice, which always represents a fixed view.

A slice has storage that is borrowed from somewhere else; it does not have storage of its own.

When you type `"foo"`, you are creating "static" storage (in the binary) and the slice is borrowed from that fixed-position location in memory.

Mostly, your functions produce Strings and consume slices.
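
(A minimal sketch of that split, in current Rust; `shout` is just an illustrative name.)

    // Functions typically borrow a slice (a read-only view) and hand back an
    // owned, growable String when they need to build something new.
    fn shout(input: &str) -> String {
        let mut owned = String::from(input); // owns heap storage, can grow
        owned.push_str("!");                 // appending works on a String...
        owned
    }

    fn main() {
        let fixed: &str = "fizz";  // a borrowed view of static storage
        // fixed.push_str("!");    // ...but not on a slice: it owns no storage
        let loud = shout(fixed);
        println!("{}", loud);
    }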


But shouldn't they (String and slice) have some common supertype to make this all easy to use?


I don't think a common ancestor is how you want to do this. Instead, I'd encapsulate the common behavior in a trait (a.k.a. an interface, in other languages), and then write functions that accept any parameters that implement that trait. In Rust, you can do this even on "built-in" types like strings. For an example, see the `print_me` function below, which operates on both string types using a trait that I've defined myself.

  trait WhatIsThis {
      fn what_is_this(self);
  }

  impl WhatIsThis for String {
      fn what_is_this(self) {
          println!("'{:s}' is a string!", self);
      }
  }

  impl WhatIsThis for &'static str {
      fn what_is_this(self) {
          println!("'{:s}' is a string slice!", self);
      }
  }

  fn print_me<T>(me: T) where T: WhatIsThis {
      me.what_is_this();
  }

  fn main() {
      print_me("Blah blah blah");
      print_me("Yada yada yada".to_string());
  }


And I expect virtually all of the methods on slice to be available on String before long. I think it's a bug that this isn't the case today.

It's already possible to have a function take "either a String or slice" generically:

    fn print_me<T>(me: T) where T: Str {
        println!("I am '{:s}'", me.as_slice());
    }
The overall ergonomics of this (or at least, the well-documented idioms) will certainly improve in the coming months.
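
(In today's Rust the same "either a String or a slice" genericity is usually spelled with `AsRef<str>`; a sketch, reusing the print_me name from above:)

    fn print_me<T: AsRef<str>>(me: T) {
        // AsRef<str> plays roughly the role Str plays above: both String
        // and &str can be viewed as a string slice.
        println!("I am '{}'", me.as_ref());
    }

    fn main() {
        print_me("a string slice");
        print_me(String::from("an owned String"));
    }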



Yes. It's a design bug that String and slice don't share a common trait, and that `.as_slice()` is commonly needed simply to use slice methods.

I expect that to be fixed before 1.0.


At least you can now do: foo[].some_slice_method()


I think this is an anti-pattern. I'm not a fan of the slicing syntax in general, which seems to exist only to paper over the missing traits that wycats mentions.


We will almost certainly just have `Deref<str> for String` and so autoderef will handle `some_string.some_slice_method()` correctly.
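
(That is how it works in current Rust; a small sketch:)

    fn main() {
        let s = String::from("fizzbuzz");
        // String implements Deref<Target = str>, so methods defined on str
        // (starts_with, contains, ...) resolve through auto-deref with no
        // explicit .as_slice() needed:
        println!("{}", s.starts_with("fizz"));
        println!("{}", s.contains("buzz"));
        let view: &str = &s; // &String coerces to &str the same way
        println!("{}", view);
    }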


Will we be able to kill the slicing syntax, then? :)


Slicing syntax is about being able to do stuff like string[1:3] AFAIK, so I hope it won't get killed.


Also, the location of storage is slightly more visible (in general) in Rust than in other languages.

I have personally found this to be pretty clarifying, because as much as we may like to abstract over it, the location of storage often worms its way into the programming model even in HLLs.


Indeed. I feel like learning Rust has helped me grok C and Java much more, because the different forms of allocation are much more visible.


And that indeed is the real distinction between what Rust has and what other languages tend to have—the fact that &str doesn’t have its own storage. The equivalent to string types in other languages would be more like SendStr.

That functions produce Strings and consume slices is a good way of expressing it.


That was informative... and it reinforced my perception of Rust as something to look into if I ever have to do something that absolutely necessitates using something low-level like C++/assembly. But for everything else that I can get away with (and that is a lot so far) I'll stick with Go, because it's so much faster and shorter to write.


I think it's, in general, useful to differentiate the speed of code-writing as a new developer and the speed of code-writing as an experienced developer.

The reality is that people have very little time to try out new languages, and have to rely on anecdotes about the long-term cognitive costs of things like this.

My personal experience is that many of the seemingly more-onerous things about Rust end up falling away once the rhythm of programming sets in.

This is likely similar to how the error-handling approach of Go looks onerous at first, but seems to be something that doesn't slow people down too much in practice.


> it's so much faster and shorter to write

This is not obviously true to me: Rust provides more tools for building abstractions, even just "handle errors less manually" abstractions, so they're possibly not that different (this certainly applies to the 'shorter' claim).

I guess it may be true for the things Go is suited/designed for; time and experience will tell.


Anecdotes and personal opinions follow.

I'm pretty happy with the speed at which I write Rust code, but I can definitely churn out Go code more quickly. (Probably on the order of how quickly I can write Python, although refactoring Go code is much faster.) I'm not sure exactly why, but my guess is that there are fewer abstractions to deal with (and fewer opportunities to make abstractions). I have written several medium Go applications (near or above 10 KLOC, which ideally, I would never hit in a dynamic language), and I'm pretty happy with how the code turned out in all but one of them. (But that one is a window manager.) I haven't yet written a similarly sized Rust application, though.

I do write Rust code more quickly than I write Haskell code though. :-)


Yeah, that's why I focused on 'shorter' more than 'quicker'. (And I wasn't saying it is not true, just pointing out that it's cut and dried.)


Whoops! I mean: that it's not cut and dried.


Bear in mind that Rust's APIs still have a lot of work remaining in their ergonomics, and this is likely what slows down the daily grind the most.


If something absolutely necessitates using C++, why not use C++? (I'm genuinely asking.)



Because writing and compiling code/projects in it is very painful. Header files, custom makefiles, etc.


Pardon me, but 'custom makefiles' have absolutely nothing to do with C++. There are IDEs with C++ support.

Also, while working with headerless languages may be easier, calling working with headers 'very painful' is hyperbole.


I was referring to "development using C++", which encompasses the language, compilers, build tools, code editors, ecosystem, basically everything that is involved when you're doing work and distributing result binaries.

There are IDEs, but they have their own proprietary project formats that are incompatible with each other.

Having header files makes refactoring by hand much more difficult than necessary, yet refactoring tools for C++ are mostly not possible.

Finally, https://gist.github.com/shurcooL/86949a392dcdac1f94cf.


Regardless of IDEs, I'd still need to learn how to use CMake and other systems to build and use libraries. It's a large pain compared to `pip install ...`.


> ...need to learn how to use cmake and other systems to build and use libraries.

On OSes for which this is true you'd be forced to compile python yourself as well, because it's also just a dependency (of pip, for one!) written in a compiled language.

Edit: My point is: usually it's as easy as 'yum install', but on the rare occasion you will need to compile, I admit. However, that could happen with pip too; don't tell me its repos are always completely up to date. And in those cases it won't be quite as simple as 'pip install'.


Here's what the final example from the blog would look like, if you were to directly translate it into C++: http://coliru.stacked-crooked.com/a/265c6ec6fb6751a9

Now, of course, no one would program that way, but I think it does help visualize what really happens.

The obvious cost from delaying the printing is that you have to branch a second time, later in the code, to consume the value. I wonder how feasible it would be to introduce some kind of compiler transform that could invert the control flow, essentially pasting the surrounding code into the inner branches, to make this abstraction cost-free.


"may not compile" would be a better title than "may not work". To a programmer familiar with compiled languages, "may not work" implies that the code will compile but produce the wrong result/a crash/undefined behavior. Rust aims to catch mistakes at compile time, so the headline is quite sensational under this interpretation.


> There is a trade‐off here for them; as a general rule, such languages have immutable strings

This is a bit disingenuous. I understand what he's aiming at, and he also mentions StringBuilder later on, but saying that a GC necessitates immutable strings is simply not true.

As a counterexample: PHP has mutable strings and uses copy-on-write in situations where it "feels" that conflicts could occur. (Granted, PHP's rules for how it handles its variables are a bit arbitrary and magical, and PHP didn't have a GC till version 5.3... but the argument still stands.)


Yes, I know I was being rather fuzzy there. I was intending to give a general impression of where the differences lie.


    for i in range(1, 101):
        print('FizzBuzz' if i % 15 == 0 else 'Buzz' if i % 5 == 0 else 'Fizz' if i % 3 == 0 else i)
python doesn't have expressions? oh dear me.


I think that "{true} if {cond} else {false}" is quite an unnatural and confusing construct, especially when you attempt to nest them. Although I'm not really familiar with Python I thought that was concatenating 'FizzBuzz' with the value of some nested ternary expression and only realised the order was inverted when I tried to parse the inner one.

The vast majority of conditionals in languages I know follow the {cond} {true} {false} order: IIf({cond}, {true}, {false}) in VB, SQL, and spreadsheets; if({cond}) {true} else {false} and ternary {cond}?{true}:{false} in C and C-derived languages; (if {cond} {true} {false}) in the Lisp family; if {cond} then {true} else {false} in ALGOL/Pascal, etc. There's probably a reason for this order, as seeing a condition in the middle of an expression feels surprising and unexpected.


Python does the same, but it has a 1-line version of conditionals, which is what the parent uses. Many Python programmers enjoy 1-liners, but I think once you start adding else clauses to them they become unreadable.


The OP's characterization is reasonably accurate in my experience. I've run into several Python programmers who didn't actually know about Python's special "if expression" syntax.

Moreover, the code you've presented here certainly isn't idiomatic, which counts for something.


sure, but when you're implicitly comparing code segments (by placing them next to each other), you should at least make the effort to make them more alike, instead of pointing out that one language is missing a feature used in the other language, especially when that claim is false.

the formatting can of course be improved:

    for i in range(1, 101):
        print('FizzBuzz' if i % 15 == 0 else
              'Buzz' if i % 5 == 0 else
              'Fizz' if i % 3 == 0 else
              i)
there's also a suspicious "return" at the end of the second code segment which mysteriously appeared some time after the first one; looks like the author was trying a little too hard to differentiate python and rust.


I disagree. I think code comparisons should be done using idiomatic code. I personally would not consider chaining `if` expressions like you've done here idiomatic Python.


and yet people write:

    let result = if i % 15 == 0 {
        "FizzBuzz"
    } else if i % 5 == 0 {
        "Buzz"
    } else if i % 3 == 0 {
        "Fizz"
    } else {
        i
    };
in rust? either this is good, readable code or this is poorly written, unintelligible code. you cannot make the argument that sometimes it is readable and sometimes not based on the presence of braces.


The comparison is not "readable", it is "idiomatic". (I'm not saying either is unreadable, just pointing out that you're attacking the wrong thing.)


Using `if .. else if .. else` in Rust as an expression is absolutely idiomatic.


I don't see a problem with the code chosen for each example. What is a problem is explicitly stating that Python doesn't have such functionality.


Yes. I responded to this initially above.


yeah, no one writes python like this.


I've seen it quite frequently and kind of like it, because it doesn't introduce any state that could leak out or get mutated from somewhere else. The ternary operator doesn't make as much sense in Python as in other languages, though, since there is no const keyword, and that's what the ternary operator is otherwise usually used for.


I've only ever seen it non-nested.


I’ve certainly written `a if b else c if d else e` before, and it reads perfectly naturally—but you do want to be careful doing such things. They’re very easy to overuse.


[deleted]


I deliberately didn’t go about omitting the number 15 or the string FizzBuzz, because that would have distracted from the key points I was making about Rust. It is not possible to make it as efficient under those constraints—you end up needing either more than one print call, or to use an owned string, where I was able to end up with a solution that didn’t require any heap memory at all.


yeah. I'm being too nitpicky. I guess part of the "beauty" is that 3*5==15 but I realized that you'd need more complexity after hitting the reply button.


Scala guy here. This kind of thing is useful - I'm happy to pay the costs of garbage collection so I don't need it for ownership, but for separating out async operations from sync operations, or database-transactional operations from non-database operations, it's great to be able to represent that difference in the type system (and without a huge syntactic overhead). But then if you want to abstract over types like MaybeOwned, you need higher-kinded types to be able to work with them effectively. Has Rust got any further towards implementing those?


Unfortunately I don't think higher-kinded types are planned for Rust 1.0. See the rfc: https://github.com/rust-lang/rfcs/issues/324 and this HN thread: https://news.ycombinator.com/item?id=7997926


They plan an aggressive release schedule, though:

http://blog.rust-lang.org/2014/09/15/Rust-1.0.html#release-p...

So they could come in one of the post 1.0 releases and that could possibly be soon.


I've always liked this C++ implementation of FizzBuzz; it's not the most clear or logical, but it's short:

    const char* outs[] = { "%d\n", "Fizz\n", "Buzz\n", "FizzBuzz\n" };
    for (int i = 1; i < 101; i++)
        printf(outs[((((i%5)==0)<<1)+((i%3)==0))], i);


Is that allowed by the standard? (Passing a parameter to `printf` but not referencing it, in the non-"%d\n" case?)


C11 §7.21.6.1

  If the format is exhausted while arguments remain, the excess
  arguments are evaluated (as always) but are otherwise ignored.
but that's just a consequence of stdarg, which does not require all supplied arguments to be consumed.


While FizzBuzz is a low-pass filter in most languages, with Rust it makes for a fascinating window into the language design!


... and therefore a viable interview question :)


The meandering run-on garden path sentences make this post very hard to follow at times.


Yeah, I’m afraid I do have a bit of a tendency to let that happen.


In the contrived Python example, FizzBuzzItem is only called if the result is a number (not in any of the modulo-0 cases) - is that intended? I can see that it works, but it breaks the analogy with the Rust code for me.


Whoa, that was indeed a mistake. Sorry about that. Fixed.


Thanks, also is the "try it" link on rust playpen linking to the wrong code?


No, that’s the code that is supposed to be there. It’s indicating that there is nothing that you can put as the lifetime there.


You know Rust team, it almost might be worth specializing this exact error message about mismatched string lifetimes to include a URL to this post, if not mismatched lifetimes in general.

You know it's going to be a FAQ....


We have unique diagnostic codes for each error, and I have plans (and an in-progress PR in the works) to point each code at a web page with a much longer "this is what this error looks like, here are some strategies for how to fix it" explanation.


The feature list in Rust really does have my eye. The biggest one in particular was type inference.

The reason type inference was such a big one is that, if you use it right, annoying situations like "Two types of strings? What is this?" go the hell away. You have three types: static built-in strings, binary strings, and a third that only makes the guarantee that the datatype can do all the things a string ought to be able to do, and from there the compiler works out the practical implementations.

This article has done a great job of killing my enthusiasm for the language.

I guess its implementation of type inference only goes as far as Go's, i.e. inferring a declaration's type from its initialiser.

Maybe I was being a bit naive in what I was expecting; hell, maybe what I'm expecting isn't reasonably possible. Bleh.


Type inference doesn't exactly paper over the differences between types automatically. It just infers types, and doesn't complain as long as all the types line up.

Consider doing something similar in Haskell, setting a variable to either be a string or Text:

    GHCi, version 7.4.1: http://www.haskell.org/ghc/  :? for help
    Prelude> import qualified Data.Text as T
    Prelude T> let x = (if True then "foo" else T.empty)

    <interactive>:3:32:
        Couldn't match expected type `[Char]' with actual type `T.Text'
        In the expression: T.empty
        In the expression: (if True then "foo" else T.empty)
        In an equation for `x': x = (if True then "foo" else T.empty)
Sure, Haskell can sometimes auto-infer very complex types, and has more extensive type inference than Rust does. But it's not magic, and will not do everything for you.

What you're asking for is not type inference, but something else. Perhaps what you really want is weak typing (automatic type conversions), or message sending (which cannot be statically dispatched).
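
(The same experiment translated to Rust, as a quick sketch: inference picks one type per expression and reports a conflict rather than converting for you.)

    fn main() {
        // Fine: both branches are &str, so `x` is inferred as &str.
        let x = if true { "foo" } else { "bar" };
        println!("{}", x);

        // Mismatched types: one branch is &str, the other is String.
        // Inference flags the conflict; it does not paper over it.
        // let y = if true { "foo" } else { String::new() };
    }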


that's a good point I suppose.


Rust has type inference similar to Haskell's: type information can flow "backwards". It is very different from Go and C++, where types of locals are 'inferred' from their initialiser and nothing else.

E.g.

  fn main() {
      let mut v;

      if true {
          v = vec![];
          v.push("foo");
      }
  }
is a valid Rust program: the compiler can infer that `v` must have type `Vec<&str>` based on how it is used. I don't think it's even possible to write a syntactically similar Go program (a variable has to be either initialised or given a type), and the C++ equivalent (using `auto v;`) is rejected, as `auto` variables need an initialiser.


The C++ analogue, although not exactly the same, is to use a `make_vector` wrapper, like

  template<typename...Args>
  inline auto make_vector(Args&&...args) {
    using T = typename std::common_type<Args...>::type;
    return std::vector<T>{{std::forward<Args>(args)...}};
  }
  ...
  auto v = make_vector("asd", "dsa", std::string("asdsa"));
It will obviously not deduce types after the vector is declared, but it's as close as one gets to type deduction based on the vector's content.

There is one instance in C++ where information does flow backwards in a sense: disambiguating template overloads. For example,

  using fn_type = std::vector<int>(&)(int&&,int&&,int&&);
  auto v = static_cast<fn_type>(make_vector)(1, 2, 3);
In this case, the static_cast information flows "back" to the type deduction of `make_vector` to deduce what Args&& is. This is not very useful, just a curiosity.


Doesn't type inference stop at function boundaries, though? I'll grant you that idiomatic Haskell uses type annotations for function signatures (unlike idiomatic OCaml), but it is optional (which is convenient in a REPL).


Yes, it does, but that's a deliberate design decision, not a flaw. We decided enforcing this idiom was a good idea.


I think it was related to the quality of error messages as well.


Right. When you infer signatures, a change in one place can cause an error somewhere else, and so the error message is very misleading.


The type inference does mean that you can care less about what specific type you’re working with, and that it is rare that you will need to write types out (except in signatures—the type inference is deliberately only local), but the distinctions are certainly still there, and due to the nature of the language they must be.

Really, this is showing one of the trickier parts of Rust, potentially balancing the claims of excessive bullishness for Rust that I have heard levelled at me! Don’t let it dim your enthusiasm too far; Rust is still very much worth trying out in practice.


What's up with the "alternative" form of Python? What about braces, semicolons, and an explicit main() function makes that version worth showing?


Well, "from __future__ import braces" is a joke, so applying the transitive property of jokes, I assume this alternative version is a joke as well.


Indeed it is. And people should certainly try executing `from __future__ import braces` if they’re not familiar with what it is.

As for the main function and `if __name__ == '__main__': main()`, that part is actually generally recommended.



