If I may just butt in on your exchange with Charles 9, I think the issue is what you understand by the phrase "the hardware itself". The difficulty is in the first word: "the".
Some hardware needs to be trusted. To my knowledge, no-one has found a way of building a trusted platform on top of an untrusted CPU. At some point, the data has to be processed in the clear. Transparent hardware encryption of memory is conceivable, but I don't know of anyone who has done it. I imagine the performance cost is a worry, and that replacing "needs to trust memory" with "needs to trust the memory controller" isn't reckoned to be worth the effort. You can, however, build a trusted data volume on an untrusted drive, and this is now commonplace.
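That last point is just volume or full-disk encryption: the drive only ever sees ciphertext, so its honesty stops mattering (beyond availability). Here is a minimal sketch of the idea in Python, assuming the pyca/cryptography package, with AES-GCM as the authenticated cipher; the key stays on the trusted side and only the sealed blob is handed to the untrusted drive.

```python
# Sketch: plaintext and key stay on the trusted side; the untrusted drive
# only ever stores an authenticated ciphertext.
# Assumes: pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # never leaves the trust boundary
aead = AESGCM(key)

def seal(plaintext: bytes) -> bytes:
    """Encrypt and authenticate a block before it leaves the trusted side."""
    nonce = os.urandom(12)                  # 96-bit nonce, unique per block
    return nonce + aead.encrypt(nonce, plaintext, None)

def unseal(blob: bytes) -> bytes:
    """Decrypt a block read back from the untrusted drive.
    Raises InvalidTag if the drive tampered with it."""
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, None)

# The drive can withhold or corrupt `stored` (denial of service),
# but it cannot read it or silently alter it.
stored = seal(b"secret records")
assert unseal(stored) == b"secret records"
```

Real disk-encryption schemes add per-sector nonce handling and key management on top, but the trust boundary is exactly this one: the key and the cleartext stay inside, the drive gets only the sealed blocks.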
Once you get to "hardware that you plug in", like USB sticks and eSATA drives, there is an expectation that "the hardware" should not blindly trust "the peripheral", and some bus architectures have been criticised (well, actually, more like written off as "do not use, ever") on this site and elsewhere for allowing precisely that.
With that context, I'd say it makes a big difference whether the hardware is outside or inside "the box", and that the test should be interpreted as "end-user serviceable" rather than taken literally. So the SD card counts as "outside" even if you have to take the case off and remove the battery to get to it. The screen, however, is definitely "inside" on a phone or laptop, but would be just as definitely "outside" on a desktop machine with a graphics card and a cable socket.
There is no shame in building systems that trust the hardware inside the box. There is plenty of shame in trusting hardware outside the box. Vendors should probably design their boxes so that you only need fingernails to access the outside parts but you need a screwdriver (possibly one of those stupid ones that no normal person has) to access the inside parts. Then everything is clear.