Could you share some best practices you've learned in generating animated product gifs? For instance, what software you use, what settings have you tuned, etc?
Generally, I'd say: keep it short and digestible. Also watch out for extra noise on the screen, otherwise it'll distract the viewer. And having an ending transition can be useful for understanding when the gif is done, since their looping nature can make this aspect confusing.
Software: Typically ScreenFlow. It has a built-in GIF preset which I'll sometimes use, or, if better compression is needed: https://gist.github.com/dergachev/4627207
"Which of the five options for GIF recording do you recommend" is still a valid question though, no? Why not save everyone the time of trying all five themselves?
Thanks for sharing, and for creating it! Great work. The tip on how to capture full sized screenshots without a browser extension[1] was already worth the visit.
I wish I could subscribe to mobile push notifications (without having to keep the site open or download an app) instead of subscribing to a mailing list. Does anything like this exist?
I've found over the years that emails as notifications get ignored by me, no matter how interested I was at the time of subscription.
Nice, but why don't you convert all those huge GIFs to HTML5 videos using e.g. Gfycat? Animated GIFs load very slowly and waste my precious data plan.
Wow, nice. I feel like a real noob; I didn't even know about the command palette. It's too bad my friends on Facebook aren't sharing these instead of all the damn recipe gifs.
From the same link, being able to take a full-page screenshot (as in, below the fold) is also excellent. I notice from the YouTube page description there is a further shortcut:
1. Open the Command Menu with Command+Shift+P (Mac) or Control+Shift+P (Windows, Linux, Chrome OS).
2. Start typing "Screenshots" and select "Capture full size screenshots".
I needed this literally yesterday, when I used MS Paint to cut and paste a screen together like a total mug.
Great! I was having trouble taking full-size screenshots through the UI last time I tried (apart from adding a custom device for desktop). This time I found "Laptop with HiDPI screen" as well, which I believe wasn't there earlier.
I was stoked when I found out about it. Sadly, the full-size screenshot makes my stylesheet switch into portrait mode, and then I get the mobile look with the bottom half of the page completely missing.
All third-party vendor scripts have to include code for many features, websites, browsers, and devices, so they will always contain unused code. GA does a lot, and it's doubtful you're using much of it.
PageSpeed is a subpar tool that really shouldn't be relied on much. Follow the best practices and don't try to fix every possible little complaint.
As far as Google serving files goes, though, it's about as optimized as it gets. The JS file is gzipped, served by Google's cloud network, and in Chrome it even uses their own QUIC protocol, which is faster. It's also likely cached thanks to the sheer amount of usage across the web, so it'll probably be the fastest-loading thing on your entire site.
It complains about the cache expiration time being too short, but there are good reasons they can't cache it longer. The short cache has no significant effect on page performance as long as the script is loaded async, but there is no way the performance tool can determine that unless it whitelisted it, which would not be fair to anyone else's tracking script.
Interesting tool, but even more interesting results. I just tried it on a simple, one-page website I built recently and there is not a single line of _code_ that's unused, yet it's still showing me 182 lines unused.
Things it seems to consider unused: `style` tags, and, if a CSS rule spans more than one line, the lines holding the selector and the closing brace.
There should be 0 unused lines since there are 0 unused rules, and the opening and closing `style` tags are DEFINITELY being used, so until these false positives get weeded out it will be noisy to try to use this to track down genuinely unused lines.
This is a fair point. The underlying instrumentation here is based on the matching of CSS rules to the DOM via selectors. We then map those rules back into source locations (including external style sheets, @sourceURL, and style tags). We do also have to account for all the other bytes in the block of CSS.
I believe in your case these bytes are all whitespace. Here's a basic comparison of two pages where all rules are used: http://imgur.com/ZtfNFmo On the left, we have typical whitespace; on the right, no extra whitespace between style tags and rules. You can see we went from 15 unused bytes to 0.
To make this extra clear, we could somehow indicate that these are "unused whitespace" bytes. The implication is a little different in that case: mostly, just use a minifier.
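As an illustration of the "just use a minifier" point, here is a deliberately naive whitespace-collapsing sketch (not a real minifier; real ones parse the CSS properly):

```javascript
// Naive sketch: collapse the whitespace between rules so the Coverage
// panel has no "unused whitespace" bytes left to report.
function minifyCss(css) {
  return css
    .replace(/\s*([{}:;,])\s*/g, '$1') // drop whitespace around punctuation
    .replace(/;}/g, '}')               // drop trailing semicolons
    .trim();
}

const pretty = 'body {\n  margin: 0;\n}\n\nh1 {\n  color: rebeccapurple;\n}';
const min = minifyCss(pretty);
console.log(min); // body{margin:0}h1{color:rebeccapurple}
console.log(pretty.length - min.length, 'whitespace bytes removed');
```

(This regex approach breaks on CSS strings and data URIs containing those characters, which is exactly why you'd reach for a real minifier.)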
So the line breaks and stuff are fine on the right? I could argue that _any_ line break in CSS is extraneous whitespace, so why are some line breaks OK but others not?
Chrome DevTools: the first reason I started using Chrome. I wonder if HN has any better alternatives to suggest? I'm curious to see what I could be missing out on!
For a long time I've used Firefox Developer Edition because its developer tools have been easier, particularly for editing and re-sending HTTP requests (right-click, "Edit and Resend" in Firefox, vs. ? in Chrome).
Also, in Firefox you can screenshot not just the whole page below the fold, but any individual element, by right-clicking it in the elements pane.
It's not up to par with Chrome Dev Tools, but something to keep an eye on is https://github.com/devtools-html/debugger.html (a project run by Mozilla). This has the benefits of being open source and very accessible to developers (because of the technologies & being on GitHub).
Edge will probably get time-travel debugging first, allowing you to rewind and inspect after something happens. Way better than adding `debugger` or console.log() everywhere.
My vague recollection of the Google event where this was first announced (was it late 2016 or early 2017?) is that it was going to "record" your site usage for as long as you were "recording" and give you the report at the end.
But this now sounds like a coverage tool for a single page?
Does anyone know if it can record over multiple pages and/or application usage (such as an SPA)?
Not that I don't appreciate the efforts they put into helping developers, and I'm sure this is for the single-page PWA crowd, but CSS and JS are far more complicated than what a single page shows (especially for big organisations).
If it's not for this page, then why are you loading it? Load it when the user goes to that other page.
That's the idea behind bundle splitting and whatnot. Don't send users code they don't need yet. Most of them are never going to get to the part of the page that does need that code.
Depending on what you are loading it may be more efficient to bundle everything in one file. If you are only loading a few kb per page, the overhead and extra time required for the network requests mean it's probably not worth splitting code.
At my day job we use app splitting rather than page splitting. Instead of breaking our codebase into individual pages and loading only those files, it's broken into apps. So you might load too much stuff for what you're currently doing, but you're never loading code that's completely unrelated.
A downside to that is that if you hit all our apps in one browsing session, you might download some vendor libraries like 10 times. But it's unlikely that a real user would ever do that.
We do something similar, but common code is shared. Webpack has a feature called chunking which we use to put everything from node_modules into its own bundle, which is loaded across multiple apps. Then the app-specific code is loaded on each page.
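For anyone curious, here's a sketch of that setup, assuming webpack 4+ where this lives under `optimization.splitChunks` (the entry path and chunk name here are illustrative):

```javascript
// webpack.config.js (fragment): put everything imported from node_modules
// into one shared "vendors" bundle that multiple apps can reuse
module.exports = {
  entry: './src/index.js', // illustrative path
  optimization: {
    splitChunks: {
      cacheGroups: {
        vendors: {
          test: /[\\/]node_modules[\\/]/,
          name: 'vendors',
          chunks: 'all',
        },
      },
    },
  },
};
```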
In many cases it may not. Remember that you have to fetch data anyway when loading a new page. You could load the new parts of the CSS on the same connection.
Agreed. I often find the extra initial loading time is worth the improved user experience across the rest of the website once the bundled CSS file is cached.
Also you don't have the combinatorial explosion of making sure that the CSS for the new sidebar/footer/reusable element is available every time it appears in a page. It also simplifies debugging specificity issues a little (but shuffles the problem around a bit).
The Coverage panel in Chrome 59 is the debut, but it's gotten a lot of improvements since then. In Chrome Beta (or you can always look in Canary), the data is reported live as things happen, and results do aggregate across different page navigations, which is what you would expect.
Shouldn't you be minimizing your CSS (and JavaScript) as part of a deployment script, anyway? This tool supports sourcemaps, so it will show you the original CSS.
My major issue with it is that it doesn't seem to take media queries into consideration. It marked as unused a number of styles wrapped in a min-width that happened to be larger than my browser window. Without automation, it's going to be time-consuming to click through every page of a site and resize the browser window from very small to very large.
Yes. We actually implemented this later. Chrome Beta (or dev/Canary) filters them out by default, with a checkbox to opt-in. Screenshot: http://imgur.com/zSfShm1
Guess I will only be able to comment on these when I get home. The full-page screenshot feature is going to be a welcome addition. I will especially have to teach it to the BAs, since they always want to take screenshots to show to business when the design is finished but test is still acting up.
This has been showing up on my work machine since our IT group started installing Chrome on all machines.
If you do a search for that and 'Chrome' you'll find sites that tell you how to work around the issue.
Chrome has long supported a workaround that gives non-admins the ability to install it (into the local user directory, I believe), so it's no surprise there's a workaround for this as well.
Personally, I found that the registry key didn't match what those sites say; Chrome still shows that message, but it will update. Your mileage may vary.
It's possible that a new version could have a breaking policy change. For example, all our intranet apps are served over HTTP. At some point, that's going to become untenable due to the changes for browser security requirements.
In general, there's no momentum to make a change that the agency doesn't "have" to make, and when the browser finally refuses to transmit forms over HTTP then we want to know about it before the change is deployed to all the users.
In the days of IE it was common because certain tools would only support certain versions of IE. Intranets and specialized web portals (like ticket systems) for example.
I like this, and it's addictive ;) Any way to automatically generate output that consists of the 100% essential code subset?
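Tools in the PurgeCSS/UnCSS family attempt exactly this: drop every rule whose selectors never match your markup. Here's a deliberately naive sketch of the idea (real tools parse the CSS and the DOM instead of doing string matching):

```javascript
// Naive sketch: keep only rules whose class selectors all appear in the HTML.
// Rules with no class selectors (e.g. `body { ... }`) are always kept.
function stripUnusedCss(css, html) {
  return css
    .split('}')
    .filter((rule) => rule.trim())
    .filter((rule) => {
      const selector = rule.split('{')[0];
      const classes = selector.match(/\.[\w-]+/g) || [];
      return classes.every((c) => html.includes(c.slice(1)));
    })
    .map((rule) => rule.trim() + '}')
    .join('\n');
}

const css = '.used { color: red; }\n.unused { color: blue; }\nbody { margin: 0; }';
const html = '<div class="used"></div>';
console.log(stripUnusedCss(css, html)); // keeps .used and body, drops .unused
```

Media queries, pseudo-classes, and classes added by JS at runtime all defeat this kind of static check, which is why the Coverage panel's runtime instrumentation is interesting.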
As suspected: a typical medium.com page contains approximately 75% extra code. The most egregious offenders seem to be content-loader scripts like embedly, fonts, unity, youtube, etc.
On the other hand, besides network load performance, I'm not really worried about the "coverage" metric. Compiling Unreal Engine via Emscripten to build tappy dodo may result in 80%+ unused code, but near-native runtime performance is a healthy tradeoff.
HTTP/2 happened. HTTP/2 lets you multiplex many HTTP requests over a single TCP connection per domain, and headers are compressed, so the per-request overhead is minimal compared to HTTP/1.x.
Caching one big file has the drawback of busting the whole cache entry each time the file is updated, for any small change.
Caching multiple small files gives you a finer-grained cache: only bust the things that were updated. And since it's all on the same TCP connection, it's now performant to load things this way.
Sorry, wrote this before coffee. I was referring to H/2 eliminating head-of-line blocking. HTTP/2 gives you one TCP connection for multiple requests rather than the 7 connection limit browsers impose on HTTP 1.x
It'd be awesome if there was a button to download a file with the code that's used, and the code that's unused, instead of just having a diff. Hint hint :)
Excellent timing. I had given up on finding a good tool for coverage of JS and CSS, and right now I was using audits in Chrome to try to find unused CSS, and searching through the code to find unused JS, on our landing page. Even if it is hard for a tool to find everything that is unused on a page, it will show what is used, so we know what we don't have to check in the code.
I think it would be interesting if this works with source maps created by things like webpack or compilers/transpilers. The unused code that ends up in the browser might come from unused code in the original files that looks slightly different, and it's interesting to see where it comes from.
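It does; the tool supports source maps, so coverage can be mapped back to the original files. A hypothetical webpack.config.js fragment (the `devtool` option is real; everything else about your build is up to you):

```javascript
// webpack.config.js (fragment): emit full source maps so DevTools can
// map the bundled output back to the original source files
module.exports = {
  devtool: 'source-map',
};
```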
Webpack bundles everything up into one huge .js file; I've gotta try how it works with that.
Plus, in my current environment (ReactJS + TypeScript + MobX + Webpack), TSLint generates errors when I have unused variables, and that breaks the TeamCity build.
It contains around 150 tips which I display as short, animated gifs, so you don't have to read much text to learn how a particular feature works.