To be fair, there's a REASON we use libraries that contain features we don't necessarily use.
Fair? How about "to be even modestly rational"?
I don't know why these "so-called experts" aren't tapping out their own silicon, then writing their own assemblers, compilers, OSes, network stacks, HTTP servers, TLS implementations, DBMSes, web browsers... Why, those things are just chock full of features most applications don't need.
Yes, there's much to be said against bloated, poorly-written libraries.[1] Anyone who pays the slightest attention to information security understands the problem of a broad attack surface. But "write everything yourself" is not a good answer.
I've written my own JavaScript library, with just the functionality I require. It's a useful exercise but hardly appropriate for every web development team. I've written two HTTP servers that appear in commercial products, and I don't recommend it as a general practice. Security is about economics, and as sisk points out, so is software development. An app that never ships is relatively secure, but it's not improving the overall state of the software ecosystem.
[1] And that does describe jQuery for at least the first half of its existence. I haven't looked at the jQuery source in years, but for a long time it primarily displayed a grotesque ignorance of the actual ECMAScript specification and behavior of the language. There are famous examples - the expectation that the typeof operator could ever evaluate to the string "array", the assumption that properties would be returned in some particular order - but beyond those the code was rife with amateur foolishness. It was a standing joke on comp.lang.javascript.
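For anyone unfamiliar with the typeof example: the ECMAScript specification defines a small, closed set of results for the typeof operator, and "array" has never been among them - arrays are plain objects as far as typeof is concerned. A quick sketch of the behavior (runnable in any modern engine):

```javascript
// typeof on an array is "object", per the ECMAScript spec - it can
// never evaluate to the string "array", which is what early jQuery
// code apparently expected.
console.log(typeof []);           // "object"
console.log(typeof {});           // "object"
console.log(typeof null);         // "object" - another well-known spec quirk

// The reliable check, standardized in ES5:
console.log(Array.isArray([]));   // true
console.log(Array.isArray({}));   // false
```

Similarly, for-in enumeration order over object properties was explicitly unspecified in older editions of the standard, so code that depended on properties coming back in insertion order was relying on implementation behavior, not the language.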