What the hell. A developer who writes x << 3 instead of x * 8 in code that is logically doing multiplication is not a good developer. Code is written for human readability; the reader should not have to mentally translate it back into its mathematical equivalent to understand what operations are taking place. In nearly every case this is at best a micro-optimization, and at worst a symptom of someone who writes unnecessarily obfuscated code to show off their knowledge, in a way that only hurts a team's ability to maintain it.
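For what it's worth, here's a minimal C sketch of the point (the function names are mine, purely illustrative): the two forms compute the same thing, and any mainstream compiler will strength-reduce the multiplication into a shift at even modest optimization levels, so the readable form costs you nothing.

    #include <assert.h>
    #include <stdio.h>

    /* Same value either way; mainstream compilers (gcc, clang, MSVC)
     * emit the same shift instruction for both forms at even modest
     * optimization levels, so the only difference is what the reader
     * has to decode. */
    int bytes_for_words_shift(int words) { return words << 3; }
    int bytes_for_words_mul(int words)   { return words * 8;  }

    int main(void) {
        for (int w = 0; w < 1000; w++) {
            assert(bytes_for_words_shift(w) == bytes_for_words_mul(w));
        }
        printf("identical results\n");
        return 0;
    }

As a bonus, the shift is actually the more fragile form: words << 3 is undefined behavior in C when words is negative, while words * 8 is well-defined as long as it doesn't overflow. The "clever" version isn't just harder to read, it's subtly less correct.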
If I found bit shifting in an interview code sample as a replacement for basic multiplication, I would ask the developer why they chose to do it that way. Either a) they calmly explain that their computer science classes taught it that way, or that it's a habit they picked up writing code for embedded systems or similar, where the optimization actually made a difference; or b) their ego makes an appearance with a "because I'm so senior" attitude. The latter is not a good sign.