Google and pals launch Accelerated Mobile Pages project

Alphabet's ads, search and cloud subsidiary Google has announced an open source project it hopes will speed web page load times on mobile devices: Accelerated Mobile Pages. The Accelerated Mobile Pages (AMP) project is driven by Google, with help from major publishers and the BBC. Early adopters include Twitter, Pinterest, …


  1. msknight

    Dumb, dumb, DUMB.

    If they want to go after something important, how about slapping web developers on the wrist, who fail to optimise their graphics. THAT is what kills web pages for me on mobile.

    When I designed pages in the '90s, it was essential that we optimised graphics for the 56k modem. Now it's important for a morphed version of the very same reason.

    Going after device performance improvements when processing power and RAM on mobiles is increasing all the time, just seems ludicrous to me. Javascript allows a lot of flexibility regarding menu structures and is relatively easy for a noob to pull off. There is the potential here that they could lock people out of web design if they're not careful ... by including something that is obvious to them, but won't be so obvious to someone coming in fresh.

    1. msknight

      Re: Dumb, dumb, DUMB.

      And there's something else here. Design and programming are two different skills. I'm better at the latter than the former. But right now, designers can get a foot in the door of coding enough to make their designs happen.

      If techies re-define stuff in a way that closes that door, then we're going to lose some great designers, because they can't get to grips with the coding necessary. And don't anyone dare say WYSIWYG.

      They've got to be very damn careful about what they do here. There's a lot of positive potential, but also a massive potential to muck it all up.

    2. Alister

      Re: Dumb, dumb, DUMB.

      I agree with you about web optimisation of graphics, however, I don't agree with this sentence:

      Javascript allows a lot of flexibility regarding menu structures and is relatively easy for a noob to pull off. There is the potential here that they could lock people out of web design if they're not careful

      To suggest that removing Javascript would lock people out of web design is nonsense, in my view. As with any programming language, if you want to code web sites you should learn how to do it correctly, not just take easy shortcuts.

      There are a lot of badly written websites out there because cutting and pasting chunks of Javascript is easy, instead of taking the time to design and build the functionality properly.

      1. msknight

        Re: Dumb, dumb, DUMB.

        My point is that learning to do the coding properly will be difficult for many of the designers. Doing both good design and good coding is not easy. I can code, but my design skills are the pits.

        To my mind, it's like asking painters to become experts in manufacturing their paints. We're going to lose more than we gain, IMHO.

        1. Dan 55 Silver badge

          Re: Dumb, dumb, DUMB.

          Perhaps first we could think about getting rid of jquery et al, 75 external tracking scripts for ad networks hanging off a web page, and huge bandwidth-eating images (looking at you Reg) before we're forced to go back in time and suffer WAP again.

          Less is more.

          1. LucreLout

            Re: Dumb, dumb, DUMB.

            @Dan55

            Perhaps first we could think about getting rid of jquery et al

            I agree. However, one of my more cowboy colleagues, who labours under the misapprehension that 7 years' experience is enough not to listen to those with multiples of that, has determined that all sites should be made responsive and slick via jQuery. There's literally no talking to him about it, or about why JavaScript is one of the biggest security risks an enterprise can permit.

            With the advent of node.js things have gotten worse still. Now he just wants to write html + js front ends for everything, whether it belongs in a browser or not, and have that talk to a js back end. There's literally no understanding of secure computing, EVAL() risks, threading issues, or maintainability. TDD is that other bloke who had an excellent adventure with Bill, apparently.

            1. dogged

              Re: Dumb, dumb, DUMB.

              @LucreLout - while I have some sympathy, TDD with Javascript is more than possible. Take a look at QUnit (and the headless phantom.js browser).

              Even easier and more sensible with Typescript.

          2. Blitheringeejit
            FAIL

            Re: Dumb, dumb, DUMB - @ Kubla Cant

            AJAX is a part of the problem, not the solution - at least in the way that some use it. Textboxes which execute an XMLHTTPREQUEST on every single bloody keypress are exactly what makes browsers feel slow and jerky.
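The standard fix for the request-per-keypress problem is debouncing: wait until the user pauses typing before firing the request. A minimal sketch, where `doSearch` is a hypothetical stand-in for the actual XMLHttpRequest call:

```javascript
// Debounce sketch: instead of an XMLHttpRequest on every keypress,
// issue one request after the user has been quiet for `ms`.
function debounce(fn, ms) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);                          // cancel the pending call
    timer = setTimeout(() => fn.apply(this, args), ms);
  };
}

let requestCount = 0;
const doSearch = (query) => { requestCount += 1; };  // stand-in for the AJAX call
const onKeypress = debounce(doSearch, 200);

// Three keypresses in quick succession...
onKeypress('a');
onKeypress('am');
onKeypress('amp');
// ...produce zero requests so far; one fires 200 ms after the last key.
```

Ten keypresses become one request, which is the difference between a search box that feels instant and one that feels jerky.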

        2. Guus Leeuw

          Re: Dumb, dumb, DUMB.

          Dear Sir,

          Isn't this what the new breed of tech-savvy kids are supposed to fix? They should know all about software programming from primary school, and if that fails, they can later go on an 18-month course that teaches them all about coding...

          Ok ok ... I'm removing the tongue from the cheek.

          Regards,

          Guus

        3. Grikath

          Re: Dumb, dumb, DUMB.

          "To my mind, it's like asking painters to become experts in manufacturing their paints."

          Amazingly, artistic painters see "Knowing How To Manufacture and Mix Pigments Into Paint" as an essential skill....

      2. kmac499

        Re: Dumb, dumb, DUMB.

        I hope this approach will kill 'Responsive Design' sites that fill my screens with oversized meaningless images and video previews; whilst stopping me from choosing a render that I find comfortable to read. Are you listening BBC...??? I thought not.

    3. Kubla Cant

      Re: Dumb, dumb, DUMB.

      1999 called - they want their web pages back.

      A major use of JavaScript these days is to build single-page applications (SPAs), where the markup remains substantially unchanged while data is sent back and forth using AJAX. Removing JavaScript will presumably mean reverting to the slow round-trips and page refreshes of yesteryear.
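The SPA pattern in miniature looks like this. A sketch only: `renderArticle` and the data shape are made up; in a browser the returned string would be assigned to an element's `innerHTML` after an AJAX/fetch round-trip, rather than returned for inspection.

```javascript
// SPA pattern in miniature: the page skeleton stays put and only a
// data-driven fragment is re-rendered from JSON.
function renderArticle(data) {
  const comments = data.comments
    .map((c) => `<li>${c.author}: ${c.body}</li>`)
    .join('');
  return `<h1>${data.title}</h1><ul>${comments}</ul>`;
}

// Where the AJAX call would go in a browser, e.g.:
//   fetch('/api/article/42').then(r => r.json()).then(renderArticle)
const fragment = renderArticle({
  title: 'AMP announced',
  comments: [{ author: 'msknight', body: 'Dumb, dumb, DUMB.' }],
});
// Only the JSON crosses the wire; no full-page reload, no lost scroll
// position, no re-parsing of unchanged markup.
```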

    4. Anonymous Coward
      FAIL

      Re: Dumb, dumb, DUMB.

      Ummmm, let's not forget production code with comments left in (Security Risk), un-min'd JS ..... I've said it before, and again .... bandwidth is cheap, except where it isn't. With today's data plans, who needs 5MB of JS because LAZY FUCKING DEVs are "that" lazy that they cannot clean up their code.

      I've seen MBs, and MBs and MBs of un-optimized code. Devs, get your shit together!!!!!

  2. Winkypop Silver badge
    Coat

    Janky

    Nephew of Jar Jar, new Star Wars film?

    No?

    --> Puts on spacesuit and steps outside..

    1. Notas Badoff
      Joke

      Re: Janky

      Actually you're quite close likening this to jAr! jAr!, to wit:

      1. (slang) of poor quality, odd

      2. (jargon, computing, rare) Unresponsive (of a software application’s user interface), sluggish.

      Me, I'm waiting for El Reg to define what a "Web Vomponent" is. Maybe that's from another StarWars movie?

      "I used to bullseye vomponent rats in my T-16 back home. They're not much bigger than two meters."

      1. AbelSoul

        Re: define what a "Web Vomponent" is

        The part of a site that makes you puke?

        Perhaps I'd rather not find out.

        1. Anonymous Coward
          Facepalm

          Re: define what a "Web Vomponent" is

          Web Components = more-bloated JS widgets, and more of them on every page. 'Cause it's easy-peasy, man!

    2. hatti

      Re: Janky

      Clone of Wanky

    3. dogged

      Re: Janky

      Janky is a pretty good word for it.

      Load, for example, any article from (let's pick a serious offender) The Verge. The page loads, the text loads, there are big holes where the pictures should be but who gives a shit, you can read the text, right?

      So you start reading. 6 seconds later BOOM you're at the top of the page because a graphic loaded. You swear and find the bit you'd got to and then BOOM another graphic, back to the top. Continue for probably another three pictures.

      Janky.

      1. Jason Bloomberg Silver badge

        Re: Janky

        Jerky and wanky. Very apt.

      2. Robert Grant

        Re: Janky

        What Car? never stop resizing the page and jumping you around on mobile.

    4. JLV
      Headmaster

      Re: Janky

      On our side of the pond:

      janky

      (adjective) inferior quality; held in low social regard; old and dilapidated; refers almost exclusively to inanimate material objects, not to people

      Urban Dictionary. 3000+ votes on first entry, so likely not a word made up by some basement dweller looking to improve his score on UD.

  3. Michael Thibault
    Joke

    This AMP-HTML...

    Does it go to 11?

    1. king of foo
  4. Christian Berger

    If you want to optimize something, go for code size...

    ...both in the webpages and on the browser. More HTML/CSS/JS code, particularly on more domains, kills performance. Many web pages now load _much_ slower than full sized screenshots of them!

    More browser code means more bugs and makes it harder to optimize the browser.

    1. James 51

      Re: If you want to optimize something, go for code size...

      If we got rid of ads, particularly the video ones that automatically start and take over the phone's screen, that would be a good way to speed things up. Or just design the sites correctly in the first place. HTML/CSS could be used to optimise sites for mobile but they aren't. We could use the tools we have better rather than create new ones.

      1. eddiejp

        Re: If you want to optimize something, go for code size...

        I try to read The Independent every day, but it's becoming more and more difficult with auto-play video ads, full screen ads and so much jankiness that you lose your place at least a couple of times. Case study on how not to do it.

  5. Anonymous Coward
    Anonymous Coward

    Interesting. Probably won't help

    Today JavaScript is blazingly ridiculously fast. Added to that, Chrome/Firefox are rapidly mashing out newly standardised APIs which make common pathways magnitudes faster. For example doing image work in separate processes and scheduling it so the main-thread can wait for the promise to resolve while staying responsive.

    The DOM (and CSS parsing, layout, rendering, and all that jazz) is fast enough, but it's a massive foot-gun. More like a foot-RPG actually.

    The main problem with the web is the advert + tracker infestation. Advert code, social media buttons, and all that embedded cruft is like someone took all the web performance best practices and coded the exact opposite.

    The other is jQuery-wielding hipsters hammering on the DOM without a passing awareness of how browsers' graphical subsystems work, and pulling down 100+ HTTP resources, because networks are magic.

    With CSS3 + requestAnimationFrame it's perfectly possible to add hipsterific features such as lightbox or image carousels which have native-like responsiveness.

    Will restricting features and managing the render loop be enough to allow hipsters free rein to sprinkle their shiny plug-ins, yet still perform like it was built by a front-end engineer who actually knew what they were doing? Either way, I'm sure the adverts will find a way to crash the party as always.
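The requestAnimationFrame technique mentioned above amounts to batching DOM writes so layout happens once per frame instead of once per mutation. A sketch under stated assumptions: the `queueWrite` helper and `applyWrite` are made up, and the `setTimeout` fallback only exists so the sketch runs outside a browser.

```javascript
// Batch DOM writes with requestAnimationFrame: one flush per frame,
// not one layout pass per mutation.
const raf = (typeof requestAnimationFrame !== 'undefined')
  ? requestAnimationFrame
  : (cb) => setTimeout(cb, 16);   // ~60 fps stand-in outside a browser

const writes = [];
let scheduled = false;

function queueWrite(fn) {
  writes.push(fn);
  if (!scheduled) {
    scheduled = true;
    raf(() => {                   // fires once, on the next frame
      scheduled = false;
      writes.splice(0).forEach((w) => w());
    });
  }
}

let applied = 0;
const applyWrite = () => { applied += 1; };  // stand-in for a style/DOM write
queueWrite(applyWrite);
queueWrite(applyWrite);
queueWrite(applyWrite);
// Nothing has touched the "DOM" yet; all three writes apply together
// in a single frame, so the browser lays out once instead of thrice.
```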

    1. ChrisLaarman

      Re: Interesting. Probably won't help

      1) I agree with this one.

      I'd like to add:

      2) Many web editors insert lots of code, even if one (well, me) tries to avoid this. It makes me stick with an obsolete one (and partly therefore with its operating system).

      3) On the other hand, isn't there a chance that rising transmission speeds will have resolved the latency before the cause has been fixed?

    2. breakfast Silver badge
      Boffin

      Also Android devices suck at JavaScript.

      In most places you are correct, but this seems to me to be a move they are making because JavaScript on Android is horribly sluggish. A modern Android device is an order of magnitude slower at rendering JavaScript than an equivalent iOS device - the Discourse team give a good idea of the state of play - and rather than solving the problem with the rendering stack they make the call to exclude JavaScript.

    3. The Indomitable Gall

      It's not just about speed though.

      The big issue with JavaScript isn't that it's slow, it's that it can easily break a page-viewing experience.

      JavaScript that tries to dynamically resize a page can flip you out of your reading position if you accidentally tilt your phone/tablet. Or worse -- there's something you can't quite read, so you zoom in for a closer look and BAM! -- the whole thing repaginates and you have no idea where you are or what you're doing.

      I don't think they're really trying to "kill" JavaScript, just trying to get people to use CSS for text formatting and keeping JavaScript to content.

    4. Dakuan

      Re: Interesting. Probably won't help

      React.js can help with some of that layout thrashing. The jsx syntax makes it easy for designers and front end engineers to work together effectively as well.

  6. Mike Bell

    Well, Google...

    The biggest mobile performance improvement I've seen in recent times is ad and tracker blocking on my iOS devices. Easy peasy.

    1. Anonymous Coward
      Anonymous Coward

      Re: Well, Google...

      Perhaps that's one of the reasons they're pushing this now, before adblocking gets entrenched on all iOS devices.

  7. Anonymous Coward
    Anonymous Coward

    "jarring experience of ??? scrolling...."

    Whatever the word means, In my experience page loading delays (crap connections excepted) are almost entirely due to Ad network response times.

    Killing javascript usually sorts that out: does this new way provide for a similar solution (he naively asks)?

  8. Anonymous Coward
    Facepalm

    Another standard?

    Why bother. This one would suffice.

    1. Solmyr ibn Wali Barad

      Re: Another standard?

      Seconded. Kill JS and CSS as many times as you fancy - stake them, behead them, burn them, slice them, in sequence and in parallel.

      But leave HTML alone.

      1. lurker

        Re: Another standard?

        "Seconded. Kill JS and CSS as many times you fancy -stake them, behead them, burn them, slice them, in sequence and in parallel."

        I see a lot of people ranting about removing JS without really appreciating what that would mean. Javascript (or more specifically some form of in-page access to XMLHttpRequest or a mechanism which does the same thing) makes a huge difference to how web applications are built. Without these, we would be back in the mid-to-late-90s world of having to press the big old HTML SUBMIT button in order to have anything on any web page change, and full-page reloads for any change to the page content.

        Speaking as someone who has been a web application developer since before either javascript or CSS were things, I'm not a huge fan of either javascript or CSS as a language or markup respectively, but that cat is most definitely out of the bag now and cannot be put back, and if they didn't exist similar functional alternatives (with their own sins and issues) would be required. I can only guess that people making these 'remove JS' requests are either wearing rose-tinted nostalgia glasses or lack understanding of how modern websites work.

        The project in question isn't talking about replacing javascript anyway, it works simply by using a single 'amp' javascript library to which it expects developers to stick with the goal of increasing performance.

        It's not going to work though. Projects like these stem from a naive engineering desire to 'refactor the web', and big as Google and its little coalition of the willing may be, I can't see this succeeding.

        1. Solmyr ibn Wali Barad

          Re: Another standard?

          "Without these, we would be back in the mid-to-late-90s world"

          Uh-huh. Would that be a good thing or a bad thing? Rhetorical dig of course.

          HTML was quite sufficient for pure information delivery. Which in turn was/is sufficient for a sizeable group of people. Those who fancy Web 2.0 interactive malarkey were a bit neglected back then. Not anymore.

          "of having to press the big old HTML SUBMIT button in order to have anything on any web page change and full-page reloads for any change to the page content"

          Good for a frequently changing page. But why use a metric fuckton of scripts in the otherwise static page? Often messing with the page loading, thus preventing people from getting to the information therein.

          Let's sum it up tidily: healthy use of whateverscript is tolerable, abuse is not. And it's usually the abuse that causes bad blood and angry comments.

    2. Shadow Systems

      Re: Another standard?

      Thirded, Fourthed, Fifthed, ... Hell, I'd upvote this to 9,001 if I could.

      As MsKnight said in the beginning of this forum, code for a 56Kb dial up modem & it'll be fast on damn near everything else you may ever encounter. Slow WiFi? Flaky DSL? Dial Up? Comcast "Broadband"? No matter *how* shitty your connection, the page will fly from server to visitor & render like a champ.

      Limit the graphics where possible, limit the annoying & asinine crufty bits, take out the stupidity with a chainsaw & extreme prejudice. The result will be a site that loads fast and clean, & will be the very definition of "Responsive".

      Basic HTML + DHTML can do a hell of a lot of things in a lot smaller code-size than the same bits "requiring" all the JS/CSS/cruft of today's supposedly "Responsive" web.

      If the Reg is any indication of speed, then I can only shake my head in dismay & disbelief over how far the web has fallen under the guise of "progress". Between all the ad servers trying to force their crap down the pipe, the back end code trying to load a bazillion widgets & plug ins, all the Social Media bits trying to auto-connect back to their respective MotherShips, et alia, by the time the *CONTENT* has loaded I've nearly forgotten why I was visiting the site in the first place.

      I've seen better, cleaner, & more responsive HTML from a Geocities site written by an AOLuser.

      Pathetic.

      1. Solmyr ibn Wali Barad

        Re: Another standard?

        "If the Reg is any indication of speed, then I can only shake my head in dismay & disbelief"

        El Reg isn't that bad. And it still lets you read stuff without JS & plugins enabled in the browser.

        Now various support sites... Hit the page, grab some coffee. Look at the little cogwheels spinning. Have a sip. Still loading. One more sip. OK, here's the page. About f-ing time. There's a shedload of pages to go through, and only so much coffee one could ingest during the day.

        What did those scripts really do? Oh. So two people have shared the page in Facebook and one in Twatter. Zero for Reddit, Tumbler, ButtPlugg, Wankr and other 37 social connectors. Gee, it seems that this RAID driver page isn't as popular as one may have thought.

        No, I do not want to participate in the satisfaction survey. You really wouldn't like the answers.

        1. GrumpenKraut

          Re: Another standard?

          > No, I do not want to participate in the satisfaction survey. You really wouldn't like the answers.

          So you would participate if there was cattle prod over IP.

  9. thames

    Some Actual Numbers

    Firefox has some very good web developer tools. Here's a brief summary of what happens when the article web page is loaded: All sizes are in uncompressed form with an empty cache.

    Totals:

    194 requests

    2,974.75 KB

    22.43s

    Of that:

    HTML: 16 requests, 312.91 KB, 22.41s

    CSS: 2 requests, 72.37 KB, 0.35s

    JS: 44 requests, 1,564.87 KB, 14.48s

    Images: 124 requests, 1,024.60 KB, 20.45s

    Flash (I don't have Flash installed): 4 requests, 0 KB, 2.75s

    Other: 4 requests, 0 KB, 0.01s

    Of the HTML, here are the top domains in terms of size along with transfer time:

    Twitter 50.16KB 0.117s

    Facebook 62.54 KB 0.343s

    theregister 67.34 KB 0.208s

    For the Javascript, 181.13 KB of that came from theregister.co.uk

    For the images, the slowest were from the following domains:

    cs.meltdsp.com 0.34KB 5.110s

    pixel.eversttech.net 0KB 5.100s

    cm.dpclk.com 0KB 10.096s

    A lot of the images seem to be very small tracking pixels. The same is true for the Javascript: much of it is very tiny (likely cookies). I currently have 50 cookies set just from loading the web page and logging into the forum. Only a couple of those belong to El Reg. Making visible page images smaller or active page Javascript more efficient isn't going to change anything that matters here because they're not the slow or large parts.

    Here's the thing that nobody seems to want to admit. Loading a web page from The Register isn't the slow part. That is very fast. It's also not very big. The problem is the ad networks. There are multiple ad bids, ads, tracking cookies and pixels, and other crap being loaded, often taking a very long time to do so (e.g. 5 to 10 seconds each in several cases).

    Fiddling with The Register's web page isn't going to do anything significant. For Christ's sake, Twitter and Facebook load as much HTML to do their tracking buttons as El Reg does to display the actual article! And there's roughly 3 dozen other parties all loading their crap, very slowly, into the page.

    Fiddling with the size of the images won't help either. Many of them seem to be used as tracking elements by the ad networks, and even very tiny ones take forever to load if they come from an ad network rather than The Register.

    The real problem is that ads are served from third party networks, and those ad networks don't care if The Register is slow, it's not their site after all. Instead, they spaff loads of crap into the page, very, very slowly.

    The real solution isn't going to be fiddling with the margins. It's going to require restructuring the ad business in order to put the content publishers in control so they can optimise the entire process, just like they do in physical print publishing. I'm not sure how to do that, but I don't see any other way.
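A rough sketch of producing a per-domain breakdown like the one above from inside the page itself, using the browser's Resource Timing data. The sample entries are made up stand-ins (shaped like `PerformanceResourceTiming` objects); in a real page you would start from `performance.getEntriesByType('resource')`.

```javascript
// Group resource-timing entries by domain: total bytes, slowest
// single request, and request count per third party.
function byDomain(entries) {
  const totals = {};
  for (const e of entries) {
    const domain = new URL(e.name).hostname;
    const t = totals[domain] || { bytes: 0, ms: 0, requests: 0 };
    t.bytes += e.transferSize;
    t.ms = Math.max(t.ms, e.duration);  // slowest request from that domain
    t.requests += 1;
    totals[domain] = t;
  }
  return totals;
}

// Made-up sample entries, shaped like PerformanceResourceTiming:
const sample = [
  { name: 'https://cm.dpclk.com/px.gif', transferSize: 43, duration: 10096 },
  { name: 'https://www.theregister.co.uk/design.css', transferSize: 72370, duration: 350 },
  { name: 'https://cm.dpclk.com/sync', transferSize: 0, duration: 5000 },
];
const totals = byDomain(sample);
// totals['cm.dpclk.com'] → { bytes: 43, ms: 10096, requests: 2 }
```

Run against a real page, a table like this makes the "it's the ad networks" argument self-documenting: the publisher's own domain is rarely the one with the 10-second entries.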

    1. Yeti

      Re: Some Actual Numbers

      Thanks for the analysis, I just added the three bastard trackers to my hosts file (twatter & FB are there already). The load time has improved tremendously.
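For anyone wanting to try the same trick, the hosts-file entries look like this, using the three slow tracker domains from thames's numbers. Blackholing to 0.0.0.0 makes the lookups fail fast rather than hang:

```
# /etc/hosts (Windows: %SystemRoot%\System32\drivers\etc\hosts)
# Blackhole the slow trackers from the numbers above:
0.0.0.0 cs.meltdsp.com
0.0.0.0 pixel.eversttech.net
0.0.0.0 cm.dpclk.com
```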

    2. VinceH

      Re: Some Actual Numbers

      What tool do you use to get those stats? I have one which measures page load times, but the information it presents is very limited compared to that.

      1. TimR

        Re: Some Actual Numbers

        Firefox Tools>Web Developer>Network>Network

        1. VinceH
          Facepalm

          Re: Some Actual Numbers

          "Firefox Tools>Web Developer>Network>Network"

          D'oh! I have it already, but never noticed that aspect of its features.

          Ta.

    3. JLV

      Re: Some Actual Numbers

      Not to mention that if you use NoScript, a typical website doesn't just leave you needing to assess how much trust to grant to its own domain.

      No, you have to decide on the javascript of all those other domains. Some of which, like CDN-hosted jquery, are delivering core page-rendering or processing functionality for that site.

      1. Boothy

        Re: Some Actual Numbers

        @JLV

        That's one of my pet peeves, web sites using 3rd parties to provide core functionality.

        In my view, all core JS etc should be hosted on the same domain as the site you are looking at.

        1. JLV

          Re: Some Actual Numbers

          Agree, except...

          In a perfect world, the CDN would report that jquery-3.1.4.js, which El Reg just requested, is the same as jquery-3.1.4.js for wikipedia and return a code 304 or whatever spec says is best. And if it was a different CDN, it should report equivalence as well and sig-check that the js contents were indeed the same, security-wise.

          The whole ...for mobile...--css...--js...++<our_new_spec_> twaddle from FB or Google seems rather self-serving, and doesn't address the core concerns with the web. CSS is, from a dev (not designer) perspective, an amazingly powerful way to separate presentation from the deeper business/data layers in a text format.

          And every time a better .js has been cooked up it's floundered, the asm stuff aside.

          Regardless, as stated by thames, we have way too many moving pieces that scrape against each other.

