HTTP-on-steroids busts out of Google

Strangeloop – a Vancouver-based outfit offering an online service for accelerating website load times – has embraced Google's SPDY project, a new application-layer protocol designed to significantly improve the speed of good ol' HTTP. The Canadian company claims that it's the first outfit to offer a commercial product to let …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Pint

    Well

    This all sounds interesting, but Bixby loses credibility saying things like "On the other side, you look at the evolution of TCP and HTTP over the last twenty years, and it's tough to argue that things are progressing. There has been almost no significant changes to those protocols to help with the bottlenecks we're facing"

    Just off the top of my head, the RFC for HTTP 1.1 was 1999, i.e., 12 years ago. If he's unaware of that, and the performance benefits it offered over HTTP 1.0, then I can't see how he's in any way qualified to comment on what improvements have been made to TCP or HTTP in the last 20 years to combat bottlenecks.
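
    For illustration, HTTP/1.1's persistent connections alone were a real win over 1.0, which typically opened a fresh TCP connection per request. A minimal sketch using Python's standard library (host and paths are just placeholders, and it assumes the server doesn't close the connection between requests):

        # Sketch: HTTP/1.1 keeps the TCP connection open between requests,
        # so several fetches can share one connection (host/paths hypothetical).
        import http.client

        conn = http.client.HTTPConnection("www.example.com")   # one TCP connection
        for path in ("/", "/style.css", "/logo.png"):
            conn.request("GET", path)            # reuses the same connection
            resp = conn.getresponse()
            body = resp.read()                   # drain the body before the next request
            print(path, resp.status, len(body), "bytes")
        conn.close()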

  2. Pat 11

    10 - 20% meh.

    Goodness, what will I do with all that time.

    Fail...they should just make browsers play relaxing elevator music while pages load.

    1. Anonymous Coward
      Anonymous Coward

      Re: 10 - 20% meh.

      This isn't really for the benefit of the end users (although they might like to say that); it's for the people running the servers. Why else do you think Google developed it?

    2. durandal

      Deploying smooth jazz

      in 3... 2... 1....

  3. Anonymous Coward
    Anonymous Coward

    Proxy Plug-in

    This would be interesting if there were plug-ins for the various proxies, e.g. Squid. Then a site (e.g. an office, not a web site) could deploy the plug-in on a proxy at the firewall, and accelerate the connections over the (relatively slower) WAN connection, while allowing the clients on the (much faster) LAN side to remain unchanged.

    AND, were the plug-in also able to operate in accelerator mode, an existing web site could apply the acceleration on the accelerator proxy and not need to modify the existing server.

    (of course, if web masters would stop including Javascript from ten unrelated sites (e.g. Twitter, Facebook, etc.) then the pages would probably load MUCH faster yet.)

    1. mittfh
      FAIL

      Never mind Social Media plugins...

      NoScript + Ghostery tend to reveal up to a dozen different ad providers / trackers on many sites - and many sites are designed so poorly that they're virtually unreadable or unnavigable without JavaScript enabled from three or four different external sites...

      1. A J Stiles

        Indeed

        I run my own DNS, and so can block the unwanted advertisement and tracking servers at the nameserver level.

        Other people's wi-fi connections always seem slow by comparison .....
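
        A minimal sketch of that nameserver-level blocking, assuming dnsmasq as the resolver and made-up tracker domains; dnsmasq's address=/domain/0.0.0.0 directive answers a domain and all its subdomains with the given address:

            # Sketch: generate a dnsmasq blocklist so the listed (hypothetical)
            # ad/tracker domains resolve to 0.0.0.0 for every client on the LAN.
            blocked = ["ads.example-tracker.com", "metrics.example-ads.net"]

            with open("/etc/dnsmasq.d/blocklist.conf", "w") as f:
                for domain in blocked:
                    # answer this domain and all its subdomains with 0.0.0.0
                    f.write(f"address=/{domain}/0.0.0.0\n")
            # reload dnsmasq afterwards so the new answers take effect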

  4. Anonymous Coward
    Anonymous Coward

    Welcome to Google's Web

    You know, the one with protocols that aren't actually properly documented - no, you can't call a document with TODO notes proper documentation.

    Oh and it's great for ads because no matter what your ad blocker says, the server will push them through anyway, to "optimise performance".

    1. DrXym

      This is the thing

      SPDY probably has a pile of merits. But it needs a proper review and standardization process to make sure the protocol is sane and well specified. Just like WebM, WebP, NaCl etc. etc.

      It will be other browsers that suffer if they struggle to implement half-baked, badly documented specs that Google can change on a whim.

      I also suspect that SPDY / WebP is the precursor to Google going after Opera's remote proxy business.

      1. nyelvmark
        Joke

        NaCl is a standard?

        How do I check whether mine is compliant? Is there a standard chip to test it against?

        What about CH3COOH?

        1. DrXym

          Google would like it to be one

          Google are promoting NaCl and PNaCl with the aim of making them standards, and have open-sourced the efforts. I think NaCl is a hack because it uses native instructions, rendering it useless for cross-architecture apps. But PNaCl is LLVM-based and has long-term promise.

          But it needs to be formally defined and subject to review: the APIs that apps can see, the security model, the permissions, the multithreading, the interaction with the DOM, storage, etc.

          Just shoving out some open source reference implementation or "trusting" Google with the standard is not acceptable for something which other browsers would need to implement. It virtually guarantees Microsoft and Apple won't play ball. Mozilla turned down a similarly half-baked spec in the recent WebP image format.

          1. ratfox
            Coat

            PNaCl

            Sounds poisonous to me. I want Phosphorus-free salt, thankyouverymuch!

            The one with the periodic table in the pocket, thanks

  5. nyelvmark
    Unhappy

    HTML optimisation

    "And any trick in the book we can use to make the HTML faster, we use it."

    Hmm. A good trick, provided the resulting code still validates and still behaves properly in all target browsers (which probably includes IE6). That's hard enough to do without optimization, so I'd be interested to see what this does to those carefully-designed pages. And what about all those webpages out there that invoke quirks mode in the browsers (i.e. they contain invalid code)? Will SPDY sort these out, or will it fall back to taking no action?

    I first wrote in HTML when nearly everyone was connected through a 56k modem, and I used a rule of thumb that a page shouldn't be larger than 20kB, including any resources (images etc) that it loaded - at the modem's theoretical 7kB/s, that's roughly three seconds to load. I never found this much of a limitation, although I wasn't, of course, displaying Flash animations and video advertisements. I don't know what I'd use as a rule of thumb today, but it can't be difficult to work out the maximum size a page can be before your target customers start losing interest while it's loading.

    It seems to me that technology like this is simply encouraging bloated website design.

    1. farizzle
      Thumb Up

      I agree somewhat..

      56k dictated the page size, and the high latency ensured that size was chopped down even more; as a result, the smaller the page, the better the experience generally was.

      I don't have figures to compare latency over 56k connections with ADSL or cable connections these days in the UK, but I guess Google is trying to hide that latency by rewriting the protocol.

      If people stuck to writing efficient code, this wouldn't be so much of a problem...

      1. nyelvmark
        Boffin

        If people stuck to writing efficient code...

        What do you mean, "efficient code"? HTML? CSS? Javascript? All irrelevant in terms of page load time. It's the binary data (images/videos/Java/Flash) which causes the problems. Text is very easy to compress. HTML, CSS and Javascript even easier (because it's easier to predict what they contain). Compressing stuff which is already compressed, like jpeg images or mpeg video, wastes CPU cycles and often results in negative compression (the output file is larger than the input file).

        The worst problem I've seen was a homepage which contained a few paragraphs about the author, and a picture of him. The image size was constrained by the layout to little more than a thumbnail, but the source of the image was a 10MB bitmap, which the browser had to download and then render into the <img> block, throwing away 97% of the information downloaded. I fixed this by converting the 10 MB BMP into a 50 kB JPG. You won't be surprised to hear that the client objected, saying he couldn't see any difference...
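
        As a rough sketch of that kind of fix, assuming the Pillow library and hypothetical filenames: downscale the oversized bitmap to roughly its displayed size and re-encode it as JPEG.

            # Sketch: shrink an oversized source image to near its displayed
            # size and re-encode as JPEG (filenames and sizes hypothetical).
            from PIL import Image

            img = Image.open("portrait.bmp")      # multi-megabyte source bitmap
            img.thumbnail((200, 200))             # downscale in place, keeping aspect ratio
            img.convert("RGB").save("portrait.jpg", "JPEG", quality=85)   # tens of kB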

    2. Rob - Denmark

      The title is required, and must contain letters and/or digits.

      "A good trick, provided the resulting code still validates and still behaves properly in all target browsers (which probably includes IE6). "

      Since IE6 doesn't support the new protocol, data for IE6 would still be transmitted 'the old way'.

    3. TeeCee Gold badge
      FAIL

      Re: HTML optimisation.

      "Will SPDY sort these out..."

      You didn't read the article then? It said that Strangeloop use a variety of optimisation techniques, of which SPDY is one - and one specific to Chrome at that. So no, by definition it won't sort out anything at all related to browser compatibility, since none but Chrome will ever see it used.

  6. Anonymous Coward
    Anonymous Coward

    "the bottlenecks we're facing"

    bottlenecks are simply unnecessary page bloat plus shedloads of ads. Scrub the ads = instant turbo. For free. Scrub jscript as I do and you've closed many security holes (not least cos 1/2 the pages won't behave correctly, idiots...)

    Some knowledgeable ones on Slashdot got stuck into the details recently and it wasn't pretty. My own question is: much of SPDY looks like a well-established protocol called BEEP, so why not use that?

    From <http://www.ietf.org/proceedings/48/SLIDES/beep-review/index.html>

    "

    BEEP is an application protocol framework for connection-oriented, asynchronous, request-response interactions. This particular subset supports a large class of Internet applications, and provides solutions to common design issues for those applications, including: framing, segmentation, structuring, and multiplexing of messages, along with authentication and privacy.

    "

    Hello, Google architect: why not BEEP?

    1. Alastair 7

      Uh...

      "Scrub the ads = instant turbo. For free."

      You do know why the ads are there, right? To earn money. Taking them away is not "free".

  7. kain preacher

    Let me guess

    Works best if you use Chrome.

    1. Rob - Denmark
      Stop

      Good guess!

      Well:

      "At the moment, Google Chrome is the only browser that uses SPDY on the front-end."

      You really get a lot of information if you read the article.

  8. Graham Marsden
    Coat

    Mein Fuhrer!

    I can download quicker!

    - Doctor Strangeloop

  9. h4rm0ny
    Headmaster

    Interwebs? INTERWEBS?

    Please tell me I didn't just see that word used in a NON-ironic manner. I weep for IT journalism.

    :_(

  10. Anonymous Coward 99

    Rollout?

    HTTP compression came along years ago, and if anything it would have been of more use back then. But how many sites enable it?
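
    One rough way to check a given site, using Python's standard library (hypothetical host): advertise gzip support and see whether the server actually compresses the response.

        # Sketch: request a page with Accept-Encoding: gzip and check whether
        # the server honours it (host is a placeholder).
        import http.client

        conn = http.client.HTTPConnection("www.example.com")
        conn.request("GET", "/", headers={"Accept-Encoding": "gzip"})
        resp = conn.getresponse()
        resp.read()   # drain the body
        print("Content-Encoding:", resp.getheader("Content-Encoding", "identity"))
        conn.close()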

  11. Red Bren
    Headmaster

    Going the way of thrice?

    Whatever happened to "twice as fast"? It's got three fewer characters to type than "two times faster".

This topic is closed for new posts.
