Re: I've just uploaded my music collection to...
>>I thought any NAS can do it?
Doesn't that rely on getting a static IP, etc?
I don't think Apple expect people to upgrade their iMacs particularly often. They just keep improving the specs in line with current tech. It's not a bottom-of-the-line beige box that's creaking when you buy it; your Mac should really last until something fails or they drop OS support, unless you run particularly demanding software.
>>Apple fanbois may disagree and make what could well be the last ever Intel-powered iMac upgrade a collectors’ item. ®
This, IMO. I rushed to buy the Mac Mini 2012 for similar reasons when they ruined the 2014 iteration. I'm still on that Mac Mini and I suspect I might hold out until the final Intel model (rumours of a refresh are rife) because I have no idea how running Parallels is going to work on Arm.
>>except, possibly, to that contingent which possesses 'open-carry' or 'concealed-carry' weapons permits.
Those people are typically right wing conservatives, who are the group most opposed to this sort of tech. You've really made yourself look quite ignorant.
OK, sure, I can see that running legacy software is a legitimate use case, but at the rate this is going, by the time it's actually anywhere near usable on a specific version of Windows:
a) Everyone's lost interest in the legacy software or replaced it
b) MS has open-sourced that version of Windows
I'm sure it's really interesting to work on, but in reality this seems more of an educational exercise. If you have really, genuinely mission-critical legacy apps, are you going to trust them to an untested Windows clone? I love that people do this kind of project, don't get me wrong.
>I have a lot of trouble seeing anything in .NET that offers any real advantage to developers on any platform.
C# is a really good language, .Net is a pretty comprehensive set of libraries.
But you seem overly focused on GUI which is a pretty tiny part of modern software development.
I would suggest that most people working on Linux are used to using other tech so why would they take the time to learn .Net? And those already on Windows are unlikely to move to work on Linux.
I would certainly look at .Net on Linux but I am unlikely to get hired for Linux work as all my background is on Windows.
The key problem is that the dataset includes, for example, pictures of Black people and monkeys labeled with the N-word; women in bikinis, or holding their children, labeled whores; parts of the anatomy labeled with crude terms; and so on – needlessly linking everyday imagery to slurs and offensive language, and baking prejudice and bias into future AI models.
I don't think I'm a particularly sheltered internet user, but I've never seen a photo of a black person or a woman with a text label like this baked in. In fact, photos of things with the words describing them on top are not a common occurrence at all.
Did I misunderstand?
Only if you use it to store media, I would say, or take a lot of videos. 64GB will hold a LOT of photos. I might be tempted to spend the extra cash on an SD card rather than the 64GB if budget were that tight?
Anyway, this sounds like seriously good bang for your buck. And an IR blaster... why is this not more common?! My TVs all claim to support mobile-link via WiFi, but it has been a waste of time whenever I've tried it.
None of my Android apps on my home screen offer content except a notification dot, and the calendar app icon tells you what day of the month it is (which I only just realised).
Unlike Apple, where you can see how many texts and emails you have from the icon.
I liked Windows Mobile, where you didn't have separate widgets and app icons. A tile could show you genuinely useful information like the weather forecast, emails, etc - there are lots of times I open an app to see one thing that could've been displayed on the icon.
ML learns from historical data to make predictions. If there isn't historical data for a certain scenario, it can't learn from it.
This is the difference between human-designed algorithms, and data-driven machine learning (AFAIK).
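A toy sketch of that distinction (the scenarios and numbers here are entirely made up): a "learner" that just averages historical outcomes per scenario has literally nothing to return for a scenario absent from its history.

```python
# Toy data-driven "learner": predicts the historical average outcome
# per scenario. It can only answer for scenarios in its training data.
from collections import defaultdict

def fit(history):
    """history: list of (scenario, outcome) pairs."""
    totals = defaultdict(lambda: [0.0, 0])
    for scenario, outcome in history:
        totals[scenario][0] += outcome
        totals[scenario][1] += 1
    return {s: total / count for s, (total, count) in totals.items()}

def predict(model, scenario):
    # Returns None for scenarios never seen in the historical data.
    return model.get(scenario)

model = fit([("recession", -2.0), ("recession", -4.0), ("boom", 3.0)])
print(predict(model, "recession"))  # -3.0, the average of observed outcomes
print(predict(model, "pandemic"))   # None: no history, no prediction
```

A human-designed algorithm could at least encode a rule for the unseen case; the purely data-driven version simply has no basis for one.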
Since we haven't had anything like this in the modern, computerised era, humans are pretty much lost making predictions too. We'll have to wing it.
You can, but you shouldn't. I've tried several at the low-price end because I refuse to spend £hundreds, and they showed promise but were not very good. I eventually got some for £35 that are actually pretty nice, as long as I don't bend over; if I do, one of them cuts out.
I am nervous of expensive ones because I don't know whether they will solve this sort of glitch or whether it's a fundamental limitation of Bluetooth.
I was a rather surprised convert to them. I thought they were stupid, but for gardening or DIY work, not having any wire or headband is a big boon for me in terms of comfort and not accidentally pulling them out every two minutes. They offer quite good ear protection as well.
Not "everyone" has gone the Goople route. Hindsight is great obviously but let's remember Commentards here spend most of their time criticising those two companies above all others for data-slurping.
Other nations ARE running into issues with the Goople approach, and in some countries that do have an app, it's not really being used.
Wearables are being discussed as a better idea, it seems. I wonder whether, in future, all new smartphone models will include dedicated, accurate distance-measuring hardware to some new standard?
In the extent and detail of what was said at a fairly technical level (in high-level terms, obviously).
The issue "the Goople distance calculations aren't good enough" seems a reasonable one to make a big deal about, given that, as Hancock pointed out, you are asking people to self-isolate based on this. If you spent 15 minutes 3m apart on a train, you do not want to be told to isolate.
All the political recriminations aside, I'm curious about the "we'll share our stuff with Goople" angle. Is that a real thing or PR fluff?
Also, do the crowd here think an app IS the right approach? Bluetooth seems a bad technology for this. Will newer phones start to include dedicated hardware just for accurate distance measuring?
I remember when they confirmed the first exoplanet (90s), which means I remember the time when it was unknown whether planets like Earth were very rare or even unique. I'm not even that old... now we not only take their existence for granted, we know they are pretty commonplace.
How drastically this particular field has evolved in only 20-ish years is pretty amazing, when one considers what a huge impact it has on our view of the universe; many of you have children who are older than that.
>>replacing 'master' with 'main' across its services
All true but black people have been systemically the target of racism from whites in the west, seemingly as a direct hangover from slave-trading days.
And not to be glib, but white-on-black racism is, well, more visible than racism between groups who look similar to each other.
I would guess people don't want to have to do boring work implementing this change just for PC's sake, though they might for genuine reasons.
If everyone starts using 'main' for new projects then great. It's a good name, and I'd suggest 'master' is to an extent less applicable in these days of DVCSes anyway. If people are going to push for existing projects to be changed, that rather needs to be justified.
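For what it's worth, the mechanical part of renaming an existing project's branch is small. The sketch below rehearses it against a throwaway bare repo standing in for the hosted remote (all paths are hypothetical, and on a real host you'd switch the default branch in its web UI rather than with `symbolic-ref`):

```shell
set -e
tmp=$(mktemp -d)
git init --bare --quiet "$tmp/origin.git"             # stand-in for the hosted remote
git init --quiet "$tmp/project" && cd "$tmp/project"
git checkout --quiet -b master
git -c user.name=demo -c user.email=demo@example.invalid \
    commit --allow-empty --quiet -m "initial commit"
git remote add origin "$tmp/origin.git"
git push --quiet -u origin master

# The actual rename, as you would run it in a real clone:
git branch -m master main                             # rename the local branch
git push --quiet -u origin main                       # publish the new name
git --git-dir="$tmp/origin.git" symbolic-ref HEAD refs/heads/main  # "change default branch"
git push --quiet origin --delete master               # retire the old name

git ls-remote --heads origin                          # now lists only refs/heads/main
```

The churn people object to is everything around those three commands: CI configs, scripts, docs, and every contributor's muscle memory.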
Fairly neat, but there are still going to be 12 thousand(!) of these blighters? They're used to provide some sort of gigantic space-WiFi array, is that right? Serving whom: regular home internet or some more special purpose? Is anyone paying or asking for this service, or is Elon assuming "if you build it they will come" and we'll all be signed up to MuskNet in 20 years... this sounds very reminiscent of a Doctor Who storyline (Tennant/Simm) to me.
Is nobody going to mention this...
Anyway it's only a matter of time until AI can fly rings around humans even if we take physical advantages out of the equation. One assumes they've already got it working in simulators - in fact are there no video games featuring highly skilled AI pilots?
Seems not dissimilar to AI-controlled race-car driving: that is proving a hard problem, but the gap is closing.
Of course, human pilots are already entirely dependent on computers. Not just basic stuff like constant trim micro-adjustments, but targeting as well, and presumably many other things.
>I'll display my whitespace how I choose, thank you very much.
That's nice, but for those of us who don't work alone in our bedrooms on code nobody else will ever touch, people like to bandy around terms like 'consistency' and 'standards'.
Or maybe you locally re-format every code file each time you check it out so it's almost a per-user setting anyway and the precise whitespace details inside source-control are not important?
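That checkout-time re-formatting idea does exist in git as clean/smudge filters. A minimal sketch in a throwaway repo (the filter name `style` and the clang-format styles are purely illustrative): smudge rewrites files to your taste on checkout, clean normalises them back to the project's canonical style on commit, so your whitespace preference never reaches source control.

```shell
set -e
repo=$(mktemp -d) && cd "$repo" && git init --quiet
# Committed with the project: route C++ sources through a filter named "style".
printf '*.cpp filter=style\n*.h filter=style\n' > .gitattributes
# Local-only, per-user: smudge runs on checkout (your preferred style),
# clean runs on staging (back to the project's canonical .clang-format).
git config filter.style.smudge "clang-format --style=WebKit"
git config filter.style.clean  "clang-format --style=file"
git config --get filter.style.smudge   # prints: clang-format --style=WebKit
```

The catch is that every contributor has to set the local config themselves, and a flaky formatter in the filter chain can make checkouts fail, which is partly why most teams settle for one committed style instead.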
It's not supposed to show you things similar to what you watched. It's supposed to show you things you might like based on what you watched (it assumes you liked the things you watched).
Netflix have invested millions into this algorithm - they ran a contest for programmers to tune it years back.
Have you tried watching suggestions? If you do, you should rate them and it will get better.
I have often been put off recommended shows, only to find they ARE exactly the sort of thing I'd like.