Re: Question
Didn't BSD switch to LibreSSL a while back because of the problems with OpenSSL?
The kernel is different but much of the "system" is the same; the driver model is also different. However, as MacOS is very much a consumer OS, most people will never use the command line or see the system files.
Apple has added some cruft to try and make things more "secure". In practice this can mean some tinkering before you can admin your own system.
The BSDs all have the same userland and library paths. That makes admin a whole lot easier. In practice, however, you're only likely to come across FreeBSD and OpenBSD, with NetBSD being a niche for those with very exotic hardware.
Package management has gone through several evolutions, but make install from the relevant port directory always works.
Every time I have to do some admin on a Linux system it seems like things have changed just enough to annoy me. Then there's all the fun of Debian vs. RedHat vs. SuSE vs. Gentoo, etc. and all the buggering around with packages that distros seem to love.
I'm looking for something to run on an RPi so that I can move my printer off the desktop. This should be possible in theory with Linux on an RPi but, unfortunately, the printer drivers (Samsung, FWIW; the printer was free at the time and works fine with MacOS) can't be installed on the Pi, and the quality without them isn't acceptable. It looks like the BSD drivers are in a better state.
BSD is a server OS, which is why it likes to own the partition (it has its own volume manager which has meant flexible volumes for years).
No licence flim-flam, stable as a rock. What's not to like?
Sounds like a case of confirmation bias. I know several companies who are working hell for leather doing in-person stuff to catch up on the stuff that they couldn't do for the last two years.
Remote working works better for some people and situations than others.
Indeed, the potential payoff of USD 1 billion if things don't go through is a lot less than the loss in value of his Tesla stock, which is being used to guarantee loans.
The SEC should intervene, because statements like this during supposed due diligence quite clearly let interested parties move the stock price.
The article mainly covers issues related to the GIL, but these affect parallelism, not speed. Much performance-critical code is already written in C (e.g. regex handling, XML parsing), with Python-specific overhead such as loops and function calls around 50% slower than C. This is good enough for the likes of CERN.
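To illustrate the parallelism point: the GIL serialises CPU-bound threads, but the standard-library multiprocessing route sidesteps it by giving each worker its own interpreter process. A minimal sketch, with an invented workload for illustration (process start-up costs can swamp small jobs, so this only pays off for real work):

```python
from concurrent.futures import ProcessPoolExecutor

def busy(n):
    # CPU-bound work in pure Python: sum of squares below n
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Threads would serialise on the GIL for this workload; separate
    # processes each get their own interpreter and run in parallel.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(busy, [100_000] * 4))
    assert results == [busy(100_000)] * 4
```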
That said, the changes to the language as part of the move to Python 3 did slow things down, especially switching text from bytes to unicode. Things have improved with each release since about 3.4, but 3.11 is noticeably faster in general use.
Better support of multiprocessor architectures and asynchronous work are required to take advantage of modern CPUs, which are no longer really getting faster. Some of this is really hard to do but the improvements in both since Python 3.6 are impressive. Asyncio is getting more and more popular and better as a result.
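The asyncio improvement is easiest to see with overlapping waits: concurrent tasks cost roughly the longest wait rather than the sum of them. A minimal sketch (the names and delays are invented; `asyncio.sleep` stands in for network I/O):

```python
import asyncio

async def fetch(name, delay):
    # Stand-in for a network request: yields to the event loop while "waiting"
    await asyncio.sleep(delay)
    return name

async def main():
    # Both "requests" wait concurrently, so total time is roughly
    # max(delays), not their sum.
    return await asyncio.gather(fetch("a", 0.1), fetch("b", 0.1))

if __name__ == "__main__":
    print(asyncio.run(main()))  # ['a', 'b']
```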
Criticisms of Python's speed are rarely grounded in fact. Yes, as an interpreted language it has an overhead in execution speed compared with a compiled one: every run requires compilation, memory use is higher, and so on. However, in many situations where lots of work is being done, the work is being done by C, C++ or even Fortran libraries, with the Python code just moving the data around. And this is the recommendation for anyone writing large systems in Python: prototype, profile and then write the relevant extensions. It doesn't work in every situation but it works more often than not.
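The "let C do the work" point can be sketched with nothing more than the standard library: the built-in sum() runs its loop in C, while a hand-written loop pays Python's per-iteration cost. The numbers vary by machine, so this is illustrative only:

```python
import timeit

def py_sum(values):
    # Interpreted loop: bytecode dispatch on every iteration
    total = 0
    for v in values:
        total += v
    return total

data = list(range(100_000))
assert py_sum(data) == sum(data)  # same result...

# ...but the C-backed built-in is typically several times faster:
t_py = timeit.timeit(lambda: py_sum(data), number=10)
t_c = timeit.timeit(lambda: sum(data), number=10)
print(f"python loop: {t_py:.3f}s  built-in sum: {t_c:.3f}s")
```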
Bloomberg is on board because the financial services industry has adopted Pandas and Jupyter en masse over Excel (never known for its speed), and Python has become the go-to environment for machine learning.
The comparison with PHP is really limited to web stuff. It is the large number of great domain specific libraries (from biology to engineering to machine learning, etc.) in Python that make the difference.
The terms "black holes", "big bang" and "plate tectonics" were all coined derisively in response to observations that turned out to be correct*. The theories and the supporting maths all came later and have since been validated by more observations.
* Well, better or more suitable than previous ones.
if you assume light dissipates its energy as it travels
We don't. Relativity pretty much proved that this isn't the case, and lasers are a pretty good example of light as a beam. The red shifts observed by Hubble et al. eventually led to the "big bang" theory. Observations since then suggest that expansion is accelerating, so that galaxies will eventually achieve "escape velocity" from each other. However, since we don't know what is driving the acceleration, we don't know if it's permanent, meaning at least two outcomes are possible: heat death through unchecked expansion; or collapse back into nothingness if gravity regains the upper hand.
But lots of other things are possible, as we simply don't know enough.
Slowly but shortly, the open-source community is becoming a kinder, gentler, and better one.
I think it's just getting bigger and more varied. Lots of people turn into arseholes when doing tasks to which they are not suited or when things aren't done the way they like. Mailing lists expose this but it goes on all over the place. And women can be just as bad as men when it comes to hurting their colleagues.
As for codes of conduct: in my opinion they are passive-aggressive manifestos from the jobsworth brigade.
Or, a new twist on the old phrase that helped Reagan win in 1980: we're from Google and we're here to help.
Elsewhere: call quality can't get any better, cameras are already fantastic. Basically, the phone is done, but our customers, the device makers, need our help to convince you to keep buying new phones…
First of all, 5G is largely marketing blurb. The real change was to packet-based infrastructure via LTE / 4G.
The research is reasonable but the chart has an uptake of 5G that is significantly faster than any comparable changes in the past. That raises a few questions.
Again marketing is in the lead here. WiFi 5 & 6 along with "5G" are driving convergence for wireless technologies: my phone already has the option to use a WiFi connection for calls where available, which is a good way to boost call quality in buildings where cell coverage will naturally be lower. Convergence should effectively increase cell density by blurring the boundaries. Higher cell density is really the only way to improve speeds for everyone.
I think there's a classic trade-off here. Once e-mail became ubiquitous it became a flood and people started looking for alternatives. Messaging is great for small groups and small things but can quickly become much more difficult to manage as volumes grow.
What I do like about some of the messaging systems is support for conversational interfaces via bots/agents. The new Telegram API looks very promising in this regard. But interoperability and data sovereignty will remain a problem.
Messaging is complementary to e-mail and can indeed be useful. However, you quickly enter the territory of incompatible protocols, data privacy and security, and the question of what to do with all the data if you want to change providers. Moreover, arranging appointments gets more difficult the more people you try to include: it's cumbersome anywhere that "I can't do Tuesday, Bob can't do afternoons, …" comes up. Conversational interfaces for calendars are the best solution, and it would be nice to see better protocols for this. But even they won't resolve all scheduling problems!
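The combinatorial part of that complaint can be sketched as a set intersection: every participant you add can only shrink the pool of common slots, which is why scheduling gets harder as groups grow. Names and availabilities here are invented for illustration:

```python
# Each person's free slots; adding a person can only shrink the intersection.
availability = {
    "Alice": {"Mon am", "Mon pm", "Wed am"},
    "Bob":   {"Mon am", "Tue am", "Wed am"},  # Bob can't do afternoons
    "Carol": {"Wed am", "Wed pm"},            # Carol can't do Mon or Tue
}

common = set.intersection(*availability.values())
print(common)  # {'Wed am'}
```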
That's the same false equivalence as the paper's: that volume equals competition. The paper, however, says fewer apps, but that doesn't mean there is no competition, just fewer competitors. And in many markets there is an optimal number of competitors: adding more just means spreading resources more thinly. There are plenty of examples of this: TV channels, mobile phone networks, etc. Oh, and all these markets are regulated.
The paper does some quantitative analysis to come up with some qualitative results. This is flawed methodology, but beloved by some economists who find it difficult to get other data. There is also the flawed premise that more is better. In fact, in many markets this leads to an equilibrium known as "more of the same", i.e. less information because there is less profit to be shared.
But the biggest problem with the conclusion is that it regurgitates Silicon Valley's most pernicious mantra: regulation is evil and the best markets have the least regulation.
There are plenty of examples that this is not only untrue but that proper regulation provides more choice and lowers costs. For example, safety requirements in road vehicles are mandated. Not only does this mean fewer road deaths but it also, in many countries at least, reduces the risk to companies because the state indemnifies them to a certain degree if their products meet safety standards. However, modern investors only seem to care about profits today because if a company has any problems they can just sell their stock; it's the inverse of the principal / agent problem. Shareholders are not liable so the worst that can happen to them is company bankruptcy. That might sound bad, but losing equity is a damn sight cheaper than having to accept liability.
I've no beef with BASIC – it introduced me to programming when the alternative was ALGOL. VisualBasic less so. But Access did have lots of problems being yet another nearly SQL implementation, with the nearly part confirming the "miss is as good as a mile" aphorism: data corruption was just too easy and it was too closely tied to MS Office and the Windows Registry.
Everyone was crying out for a browser that worked well overall for the majority of people
I don't remember customers crying out for a new browser. Post-2004 the web was made for IE and, despite the best efforts of Opera and Mozilla, nobody seemed to care. Chrome didn't really take off until Safari did via the iPhone. Both companies were keen on a replacement for Flash for multimedia. What was that you were saying about facts?
I'm a big admirer of what Google has achieved with Chrome and how, despite relative dominance, it continues to play reasonably fair in the development of the web. But make no mistake: Chrome is very much a means to an ad-dominated end.
ARM is already more profitable than it was when it was bought. But firm valuations are driven largely by the cost of capital, which is getting more expensive as interest rates rise. That is probably the reason for selling in chunks: some money now in the hope of more when interest rates fall again.
Oh, I use adblockers all the time and also find it despicable that paying subscribers are also supposed to be subjected to ads.
But that doesn't mean that Javascript per se is evil. For example, being able to validate forms during completion is a huge usability benefit. OK, arguably this is an example of one of the many deficiencies in forms, but current implementations require JS for real time feedback.
Well, you can do that if you like but some of the stuff is useful and nice to look at. Life was simpler before there was TV, too.
If done correctly, the performance "hit" isn't significant. There's a huge difference between something like HTMX and many of the multi-MB "frameworks" that some lazy people favour.
I'm all for unions and worker representation. No buts, no qualifications.
However, in the US, unions tend to favour closed-shop arrangements and are often too close to politicians. For example, in the recent pandemic, teachers' unions in Chicago and California forced politicians into keeping schools closed, even though it was already known that this was extremely detrimental for many children. And, as data from Florida and elsewhere showed (and I think de Santis is a tool), children were at a lower risk of infection in school than out, and posed a low risk of infection to teachers.
US tech companies love talking about the benefits of share options, et al. but love extending the working day even more. Normalised data on US productivity casts a dim light on this.
And, why is ENDING (apparently) POLITICALLY MOTIVATED BANS *BAD* for them?
At a guess, because they were being pressured by companies that advertise with them, but also by some of the "opinion leaders" that make the platform popular in the first place. Twitter doesn't offer the kind of granular demographic data that Facebook does, the kind that attracts niche (and political) adverts for whichever self-selecting minority is out there: "woke", Nazi, whatever.
Oh great. So "reasoned analysis" can be done in 280 characters, can it?
I don't mind message limits to encourage focus, which in turn might stimulate interest, but they are not suitable for serious discussion. Not that you can't thrash ideas out over a thread. But, again, this is hardly something Twitter has a monopoly on, which is possibly why it's struggling to generate sufficient revenue from advertising.
it's difficult to argue that it doesn't have a signifant effect on today's world.
It is very easy to do just this. While the media love the shitstorms and the sense of immediacy, they're just part of the echo chamber: quotes on Twitter are easier to get than interviews.
For the millions of people who do, sigh, take social media seriously, Twitter has long been replaced by the more visual media of Instagram, YouTube and TikTok, which is why that's where the money is: why bother risking money on advertising when product placement is possible?
2FA is probably the easiest approach. Network analysis is good for getting an idea of the scale of the problem but not so good for enforcement. Also, lots of people use VPNs because of geofencing.
Cheaper plans with ads will no doubt bring some new users, but then you are competing directly with whichever ad-funded systems are already out there.
I think the biggest problem that streaming services are facing is that creating lots of good content is hard. A few years ago, Netflix was on its own in showering cash on productions, in a way that HBO did 30 years ago. Now there is much more competition for not much more creative talent.
With its own silicon it looks like Apple is increasing the range of options: MacBook Airs won't be full of oomph but will have more hardware acceleration to make battery life better, which is a big selling point.
But for the high-end notebooks there can never be enough power: more of it means more VMs and faster compile/test stages.
This isn't about copyright because LinkedIn is specifically not trying to enforce it. Instead it wants to be able to pick and choose which sites can make use of the data without entering into individual copyright agreements.
Of course, if they do want to go with copyright then they might have to revisit the agreements they make users sign.