FTP celebrates ruby anniversary

File Transfer Protocol (FTP) marks its 40th anniversary on Saturday (April 16). The venerable network protocol was first proposed by Abhay Bhushan of MIT in April 1971 as a means to transfer large files between the disparate systems that made up ARPANet, the celebrated forerunner to the modern interweb. The protocol required a …

COMMENTS

This topic is closed for new posts.
  1. Roger Grumble

    not silver

    40 years would be a ruby anniversary. Silver is 25 years.

    1. John Brown (no body) Silver badge
      Happy

      re: not silver

      Making it even more relevant since most of the FTP traffic is pushed around the intertubes by lasers these days (ok, so it's not your actual ruby lasers anymore but...)

  2. Lee Dowling Silver badge

    Worst protocol in history

    FTP shouldn't be celebrated, it should be binned.

    Still, most places that use FTP are using plain-text logins. The "secure" FTP basically establishes a secure tunnel and then talks old-style plain-text FTP over it - nothing to do with FTP itself - or is one of three OTHER secure protocols that are all quite similar bodge-jobs.

    FTP is one of only a handful of protocols stupid enough to try to embed IP addresses in the data stream (which is an OSI layer violation for a start, and causes no end of problems with all sorts of systems). Every NAT system on the planet has to have a special exceptions module to handle FTP data connections.
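
    To see just how daft that is, here's roughly what the embedding looks like on the wire - a minimal Python sketch of the RFC 959 PORT encoding (the address and port here are made up):

        # The client's own IP address and listening port are sent as
        # comma-separated decimal ASCII inside the command stream.
        def encode_port(ip, port):
            """Build a PORT command: PORT h1,h2,h3,h4,p1,p2"""
            p1, p2 = divmod(port, 256)
            return "PORT " + ",".join(ip.split(".") + [str(p1), str(p2)])

        def decode_port(arg):
            """Parse the argument back into (ip, port)."""
            parts = arg.split(",")
            return ".".join(parts[:4]), int(parts[4]) * 256 + int(parts[5])

        print(encode_port("192.168.1.10", 5001))   # PORT 192,168,1,10,19,137
        print(decode_port("192,168,1,10,19,137"))  # ('192.168.1.10', 5001)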

    It also doesn't standardise file listings (which can be literally the output of a "dir" or an "ls" command, or anything in between, and you have no idea which until you start receiving it), and don't even get me started on the SITE command abuses that are possible.
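
    On the listings point, here are two of the many formats a client might get back from LIST (illustrative examples; there is no standard):

        # Unix-style ls output...
        unix_style = "-rw-r--r--   1 ftp   ftp    1024 Apr 16  1971 readme.txt"
        # ...versus DOS/IIS-style output; the client can't know which of
        # these (or anything else) it will get until the lines arrive.
        dos_style = "04-16-71  09:30AM                 1024 readme.txt"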

    It's an abomination that should never have come into common use and should have died years ago. But still it's used and even seen as a "reliable fallback" by some people. Kill it off now. It shouldn't have made 4, let alone 40.

    1. Aaron Em

      Hear, hear!

      It's been forty years. No legitimate excuse remains.

    2. Chris Miller
      Happy

      Anyone who complains of

      'an OSI layer violation' should be condemned to use GOSIP and X.400 for the rest of their lives. In any event, FTP predates OSI by about a decade.

      Oh and it's NAT that broke FTP (and many other protocols) rather than the other way round.

    3. Ken Hagan Gold badge

      Bad, but there's probably worse out there.

      "an OSI layer violation"

      Forget the OSI bit. It's a layer violation and therefore offends against any sane software engineer's sense of aesthetics. But it's not alone in that regard.

      "Every NAT system on the planet has to have a special exceptions module to handle FTP data connections."

      Yes and no. NAT is an abomination unto <deity> and the required exceptions are the penance that we pay for using it. (Bring on IPv6 and replace all the NATs with firewalls.) On the other hand, FTP embeds IP addresses *as text* and therefore the patching changes the size of packets and requires that the TCP stream be completely rebuilt as it passes through the NAT. That adds a whole new dimension of pain for the implementor and makes FTP one of the worst protocols in this regard.
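
      To make the pain concrete, a toy Python sketch (real NAT ALGs do this on raw packets; the addresses are made up):

          # Rewriting the textual address changes the payload length,
          # so the TCP sequence numbers of everything that follows
          # must be shifted too.
          old = "PORT 10,0,0,5,19,137\r\n"        # private address: 22 bytes
          new = "PORT 203,0,113,77,19,137\r\n"    # public address: 26 bytes
          delta = len(new) - len(old)             # +4 bytes
          # The NAT must add `delta` to every later sequence number on
          # this stream (and subtract it from the returning ACKs),
          # which is why the stream effectively has to be rebuilt.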

      In FTP's defence, it is documented and if you follow the spec then your implementation will work with other people's versions. I can think of file transfer protocols that aren't so smart in this respect.

    4. copsewood
      Thumb Up

      I use SFTP regularly

      That's FTP wrapped in SSL using the SSH toolkit. I don't allow uploaders to my shared web host to use vanilla FTP, because that exposes passwords, and even Windows has a GUI drag-and-drop SFTP client (WinSCP). I think either SFTP must rework the FTP protocol somewhat to avoid the IP-passing layer bodge, or the packet rewrap to handle NAT readdressing must happen at the SSL layer and not in the router, as the router doesn't know the SSL crypto key. Most of the time SFTP is wrapped in a nice GUI drag-and-drop client when used manually, though the command-line version is easy to wrap into shell-script automation and fine for the occasional one-off upload/download when working at the command line.

      What are the alternatives? If you are supporting Windows LAN file sharing in native mode, you are likely to be using Samba or native CIFS. NFS is probably still your best bet for sharing filesystems within a high-performance Unix/Linux LAN cluster under common admin. Most of my automated file sharing between systems involves rsync over SSL, because for backups it's just the differences that go over the WAN - the whole caboodle only at initial setup time. Casual distribution via the web uses HTTP, of course, and groups of people who don't know each other but want to share content will tend to use BitTorrent. But I think SFTP still has its uses, particularly for uploads to and maintenance of shared hosted servers.

      1. Allan George Dyer
        Boffin

        SFTP isn't wrapped FTP...

        and if you're using SSH, I doubt that SFTP is using SSL.

        SFTP was designed as an extension of SSH to provide file transfer and management capabilities. Fortunately, it doesn't inherit the "features" of FTP at all.

        The last time I used FTP was to get some publicity proofs from a marketing company; they had a shared username/password for everyone accessing the host.

        Anyone still running an FTP server should not be trusted with anything confidential.

    5. Anonymous Coward
      Anonymous Coward

      Sort of disagree.

      FTP was a great way of getting files around when your alternative was email.

      Probably shouldn't have survived the '80s, though.

      FTP/Telnet/Gopher/NNTP were designed for simpler days. They were revolutionary at the time. Of course, those days were a long time ago.

      1. ChrB
        Heart

        Telnet...

        ... is still my everyday tool to diagnose network hassles.

        The pity is - yeah, I'm a Windoze guy - that it's no longer included in the plain vanilla install of W2K8. But edlin is... I mean, come on, edit.com is fine, but edlin? But that's a different story.

        Cheers!

  3. Anonymous Coward
    Anonymous Coward

    Spawned the first Internet search engine...

    "Archie" came about because someone found that maintaining listings of FTP sites became too much of a chore using command-line tools. The first "Archie" server was hosted on a Sun-4/280 running SunOS 4.0. (Not yet a silver anniversary.) Happy days!

  4. David Harper 1
    WTF?

    GUIs are for wimps

    "Early versions of FTP were command line based, restricting mainstream use of the technology until the advent of the first browsers around 1992."

    Only if by "mainstream" you mean "clueless lusers".

    Those of us who used computers professionally before the world-wide web came along were perfectly happy to use command-line tools. In fact, many of us still prefer the command-line, because it gives us a level of control and flexibility that simply can't be had from GUI tools.

    And yes, I am a Unix geek. Why do you ask?

    1. Aaron Em

      Bragging about CLI usage is for wankers

      It's not that you are a Unix geek; it's that you think Unix geeks are the only people who should be allowed to use computers. That's not to do with being a Unix geek; it's just you being an asshole.

    2. Anonymous Coward
      Anonymous Coward

      Dear dear dear

      Any geek worth his salt would call himself a UNIX geek.....

      Not a Unix geek.

      Kids these days :-)

  5. Tadas Jelinek
    Stop

    Congratulations!

    Now die, for Pete's sake!

  6. Peter Gordon
    Thumb Up

    Re: Worst protocol in history

    Having written an FTP client completely from scratch, I agree with a lot of what Lee says.

    Still, happy birthday FTP.

  7. No, I will not fix your computer
    Heart

    I Love FTP

    It has some obvious advantages:

    1. It's almost always available

    2. It can do DOS/Unix/EBCDIC translation

    3. It's very efficient/lightweight

    4. Ability to re-start a transfer without losing the data already xferred

    5. Anonymous FTP sites are easy to configure and maintain, ideal for "drop" sites

    Some non-obvious ones:

    1. You can use it for cross-machine directory structure transfers under UNIX (like a port forward)

    2. You can use SSH to encrypt just the control connection (useful on low-powered boxes, as you avoid the overhead of encrypting the data if you don't want it)

    3. You can use it to remotely view or create files dynamically (you don't have to transfer/save/view or edit/transfer)

    4. Scripting is possible, either with clunky shell scripts or with programming languages that have FTP APIs, such as AutoIt, Visual Studio, Perl etc. (see the sketch below)
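
    As a flavour of that last point, a minimal sketch using Python's standard-library ftplib (the host, credentials and file names are made up); it also demonstrates the transfer restart from point 4 of the first list:

        from ftplib import FTP
        import os

        def fetch(host, user, password, remote, local):
            """Download `remote` to `local`, resuming a partial copy if present."""
            ftp = FTP(host)
            ftp.login(user, password)
            ftp.set_pasv(True)   # passive mode: friendlier to firewalls/NAT
            # If a partial file exists, ask the server to skip that many
            # bytes; ftplib's `rest` argument issues the REST command.
            offset = os.path.getsize(local) if os.path.exists(local) else 0
            with open(local, "ab") as f:
                ftp.retrbinary("RETR " + remote, f.write, rest=offset)
            ftp.quit()

        fetch("ftp.example.com", "anonymous", "guest@", "pub/image.iso", "image.iso")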

    Some obvious disadvantages:

    1. No standard encryption (use sftp/scp instead)

    2. Issues with firewalls and data connection (use passive transfers instead)

    3. Some reconnection security issues (disable them)

    4. Heterogeneous systems may have issues such as bin/ascii defaults and odd directory-listing formats (true, but these shouldn't come as a surprise, and a quirky transfer method is better than none)

    Is lack of security an issue? To snoop you'd need elevated rights on one of the machines or physical access to the network, which is harder on switched networks - not impossible, of course, but there are easier ways of getting passwords, such as keyloggers. So if you're transferring files around a secure datacentre or home network, perhaps there's no issue. Simple configurations like chrooted FTP and accounts with no SSH-style logins restrict how it can be used, and linking it into a token-based authentication system means one-time passwords (or, at worst, vulnerability windows of a few seconds).

    All that said, it's horses for courses. I use HTTPS, FTP, SCP, rdist and SMB at home for all my file-transfer needs, balancing performance against security and ease of access. I wouldn't want FTP to be the only option, but I wouldn't want it to be unavailable either.

  8. ChrB
    Pint

    Still widely used

    You wouldn't believe how many FTP connections we have in place to various business partners. And the number is still growing. Many banks run FTP-based platforms to exchange data bulk-style. When securing the payload using PGP or such, there is no easier way (yet) to do this.

    I tend to think that the simplicity of setting it up is its main advantage compared with other communication methods (like AS2, Connect:Direct, MQ - of course these provide more nifty features, but one rarely really needs them in our scenarios).

    The best of FTP is surely SFTP, where only one channel is required: no hassle with NAT and stuff.

    So, cheers then!

  9. /dev/null
    FAIL

    Some confusion here

    RFC 114, published in April 1971, defines a file transfer protocol for the ARPANET based on the old NCP protocol and preceding the TCP/IP-based Internet by about a decade.

    The TCP/IP FTP protocol we all know and love, first published in 1980 as RFC 765 and further refined by RFC 959 in 1985, is a fish of an entirely different kettle.

    And what's Gopher got to do with it? Gopher came and went in the early 90s.

    1. Anonymous Coward
      FAIL

      Hang on, what about RFC354?

      The initial series of responses and revisions to RFC114 went on through the remainder of 1971 and into the early spring of 1972. The outcome was Abhay Bhushan's revised proposal in RFC354, dated July 8th, 1972. Although it operates over ARPA IMP-based connections rather than TCP/IP, it's still very clearly the same FTP that we all know and love to hate:

      - it has the telnet-based command connection and separate data connection architecture

      - it uses the familiar ASCII commands rather than the binary protocol of the original (this is where RETR first appears, so its own actual 40th anniversary isn't until next year)

      - it introduces the 3-digit response code format, and many of the status codes that are still in use with the exact same meanings today.

      In fact, if you compare RFC765 to RFC354, you'll see that 765 is simply a rewrite of 354, updated to refer to TCP/IP rather than the ARPA protocols. The document is structured the same way, covering the same topics in the same order, and much of the text of 354 is duplicated verbatim in 765.

      Clearly therefore 354 is the real birth of FTP, and 765 is in fact a fish of exactly the same kettle.

  10. Alister

    Still used here

    We still use FTP (over VPN) as the quickest method of uploading large chunks of data to remote servers; no other protocol we have tried is faster or consumes less bandwidth, so I'm quite happy for FTP to continue.

    1. Evan Essence
      Thumb Down

      Re: Still used here

      Less checking gives less overhead, so reliability suffers. IME it's always necessary to bolt on your own check of the file lengths as transmitted and received, to ensure the file hasn't been truncated - maybe a checksum instead of, or as well as, the length. Pretty silly, really; much easier to use a reliable transfer method in the first place.
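
      Something like this, say, with Python's ftplib (host and file names are made up; SIZE is a later extension from RFC 3659, so not every server supports it):

          from ftplib import FTP
          import os

          ftp = FTP("ftp.example.com")
          ftp.login()                      # anonymous
          with open("download.bin", "wb") as f:
              ftp.retrbinary("RETR pub/download.bin", f.write)
          remote_size = ftp.size("pub/download.bin")   # may be None
          ftp.quit()
          # The bolted-on check: compare received length with the server's.
          if remote_size is not None and remote_size != os.path.getsize("download.bin"):
              raise IOError("transfer truncated")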

  11. phil 21

    nic.funet.fi

    I was at uni just before browsers took off (we had Mozilla 0.8, but it didn't do much), so FTP was a big thing for me and my Atari disk-swapping friends back then. Much easier to dump an MSA disk image on an FTP server and email the directory link than to post a Jiffy bag across Europe :)

  12. Anonymous South African Coward Bronze badge

    FTP...

    Also using FTP here as a fallback for when SCP doesn't work - which is very rare.

  13. davcefai
    Thumb Up

    In Brief

    It almost always works: you can type "ftp" into just about any shell on any machine, and there is nothing else to challenge it.

    Happy Birthday FTP!

  14. ZimboKraut
    Pint

    FTP in today's world has no right of existence

    I have been using computers for the last 30 years, professionally for about 22, and have to admit FTP was very useful and definitely has its place in the evolution of IT. But in today's world, anyone who implements FTP as a means to transfer files across a public network should be slapped straight across the face.

    Never mind the NAT issues - clear-text authentication and the like are things that should just not be used in today's world.

    We have tools like SSH (sftp/scp), which are available on pretty much any common platform: Windows, Linux, the whole range of Unices, BSD, Symbian, Android, etc.

    It's just like using telnet on a publicly accessible switch/router.

    1. Gotno iShit Wantno iShit

      @ZimboKraut

      Sorry, but no. davcefai had it correct with "you can type 'ftp' into just about any shell on any machine"; the secure alternatives you list do not have this advantage. The others you mention may well be available across the board, but "available out there somewhere on the wobbly wild web" is not the same as being able to bring up a command prompt and just use it.

      As for transfer across a public network: why should I give a stuff if someone sniffs the username and password to my server, if every item on it is adequately encrypted? Sure, you're not going to rely on one level of protection like that for the company's crown jewels or military material, but for general stuff, why the hell not? Often enough there's no need to encrypt anyway; what might be picked up has no value or would be meaningless. FTP has the lowest overhead bar none, and for occasional file delivery to remote locations an anonymous FTP server still makes a marvellous drop box.

  15. Christian Berger

    Well, there are some very interesting ideas behind it

    For example, you request a file and then the server connects back to the client to deliver it. That is ingenious: it means a different server can deliver the file, so you can build gigantic server clusters, each one storing only part of the directory tree.
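
    That's FTP's original "active" mode: the client listens and the server opens the data connection back to it, which is what makes the redirection idea above at least conceivable (though modern servers and firewalls mostly forbid it). A minimal Python sketch (the host is made up):

        from ftplib import FTP

        ftp = FTP("ftp.example.com")
        ftp.login()                  # anonymous
        ftp.set_pasv(False)          # active mode: server connects back to us
        lines = []
        ftp.retrlines("LIST", lines.append)   # data arrives on the inbound connection
        ftp.quit()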

  16. Dan 55 Silver badge
    Thumb Up

    Still need FTP today

    If only because my work proxy doesn't know FTP exists: while it complains about files bigger than 10MB coming through HTTP, I can get them through FTP.

    Stuff like new versions of Acrobat and Firefox, Eclipse, Windows updates, and other terrible things which are banned due to enlightened company policy. And then they wonder why their network is virus-ridden.
