The birth rate in Switzerland (just as in many highly developed countries) is already way below 2.1 children per woman, which would be required to sustain the population. Any population growth comes from immigrants. Xenophobic people are scared by that.
Yes? Modern portable computing enables counter-surveillance of police, better communication and knowledge access for dissidents, and interfacing with institutional computer systems for any number of ends. The George Floyd protests don't happen if bystanders don't have smartphones, or if protesters have to carry around an Alienware tower; the Snowden leaks don't happen at the magnitude they did without memory miniaturization. There are international examples, too, and commensurate crackdowns on computing freedom (particularly in Hong Kong).
You've got a supercomputer and a library and a set of video production equipment in your pocket, among other things. The capabilities of such a device are fundamentally different from something that's tethered to a desk or that's conspicuous when out-and-about. The idea of it being open and untrackable is exciting for some and terrifying for others.
Even after Google puts this crap in place, you can still install your own apps on your own Android devices using ADB. Doing the same for iOS, using Xcode, costs USD 100 or more (depending on country) per year.
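For anyone who hasn't done it, the ADB route is just a couple of commands. A sketch, assuming USB debugging is already enabled in Developer Options; `app.apk` and the package id are placeholders for your own app:

```shell
# Sideloading your own APK over ADB (device must be plugged in and authorized).
adb devices                        # phone should appear as "device", not "unauthorized"
adb install app.apk                # install without any app store
adb uninstall com.example.myapp    # remove it again later (hypothetical package id)
```

The first time you connect, the phone prompts you to accept the host's ADB key; after that, installs are a single command.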
I'm in no way defending Google here, just pointing out you're going from bad to worse and think it's a good thing.
I haven't seen new data from Cellebrite in a while, but I believe that GrapheneOS was the only truly secure phone against it, for both BFU (before first unlock) and AFU (after first unlock), as of a couple of years ago.
Apple lost my confidence after they removed Advanced Data Protection for British users (plus implemented age verification for them).
iPhones with Lockdown Mode enabled have definitely been exploited, which is confirmed by leaked documents and statements from commercial exploit vendors. Lockdown Mode primarily reduces the attack surface in Safari and from Apple services. It does very little to protect against other attack vectors, such as messaging apps or physical data extraction.
You're thinking of Apple saying they themselves haven't detected a case of a device with Lockdown Mode being exploited in the wild. Extremely few devices use Lockdown Mode, and Apple has very little insight into successful exploits, so there isn't much opportunity for them to detect one in the first place. Lockdown Mode bundles everything together and makes very inconvenient changes many people won't accept. That greatly reduces usage, even by people fully aware of it who want a lot of what it provides.
Apple has said they haven't seen a case of a device with Lockdown Mode being exploited, which is extremely misleading. Apple doesn't have much visibility into devices being exploited and would mostly see failed attempts. All of the Lockdown Mode functionality being bundled together contributes to it barely being used: there's no opt-out for most of it beyond disabling it as a whole, and only a subset of the Safari restrictions can be partially disabled per-app and per-site, which doesn't fully restore web compatibility. It's more that hardly anyone is using it, and that Apple doesn't have much insight into apps and the OS being exploited successfully in the first place. Lockdown Mode is definitely useful, but people should read about what it actually does and compare that to how devices actually get exploited. Apple's memory corruption exploit protections aren't tied to Lockdown Mode.
How then is law enforcement getting what they need from people's iPhones? Because I understand they do, in some way. And I'm not asking about forcing people to hand over PINs or fingerprints, but just by themselves.
Lockdown Mode is focused on reducing the attack surface from Safari (including the WebView) and from Apple services (including iMessage/FaceTime). It does nearly nothing to protect against non-browser/non-messaging attack vectors in the OS or other apps. It's up to app developers to implement similar restricted modes and also baseline exploit protections. App developers need to explicitly opt in to the standard exploit protections used in many parts of the OS, and Apple discourages doing it:
iPhone security is a myth. This is because you can't scan an iPhone for threats, so Apple can pretend they don't happen. iOS is probably the least secure platform there is, thanks to Apple's security-by-obscurity approach.
You can use an iPhone blissfully unaware it has malware on it, even in Lockdown Mode (which is essentially a coping mechanism and Apple's way of saying "we care about security, trust us bro").
You really think Apple doesn't gather data on what you do on your devices? This notion that Android == spyware is so old and boring but HN just loves Apple.
I'm sure they do collect data, but not to the point that they hamper functionality. They still focus first and foremost on usability and functionality, whereas Google focuses on collecting data and serving ads first, and on functionality after that.
But yeah, there is no doubt in my mind that they both collect as much as they can.
Google gets nearly all of its revenue from targeted advertising, and Apple does not. Apple has an incentive to restrict or completely deny third-party data collection, because they’ve made privacy a major part of their brand marketing and there is major reputational risk to Apple for being caught lying about this. Apple’s “Ask App Not To Track” feature made such a measurable dent in the revenue of various surveillance tech companies that they complained about it, loudly, including Meta paying for a full-page ad in the New York Times about it.
There are multiple objective reasons to believe that Apple is a more trustworthy actor here than other companies, including vulgar capitalistic reasons.
You can just say “pfft, wow, you really believe that?”, I guess, but if that’s your position there’s no reason to argue about this with you.
Apple's ad revenue has been growing massively over the past few years and is projected to be a $13 billion revenue stream next year. Where do you think those ads are ending up, and do you really believe they are non-targeted? So while your statements are still somewhat valid, not that much and not for that long.
Also, for anybody from outside of the US, it's the US three-letter agencies that pose the biggest actual security risk, since US laws treat us as sub-humans. Apple is as transparent to those agencies as Android. But I get it, it's still much easier to run a security-based PR campaign for Apple than for Android.
Can I plug in an iPhone via USB-C and access photos and videos, and the rest of the filesystem, directly? That's my flow. I'm not buying a phone which has this artificially disabled 'for my own good' while being Unix under the bonnet. Insult to my intelligence and all that.
While not equivalent to a true iOS app, PWA is a decent option that allows you to circumvent the app store restrictions. If you are trying to build apps primarily for yourself, it's a decent option.
Actually I have been tinkering with PWA as a way to remake some of my toy apps. Though a lot of the automations I made for Android can be replicated through Apple’s Shortcuts app.
The biggest loss for me was Termux. I had lots of scripts and such that I ran, plus just having a Linux environment in my pocket was nice. Luckily I found iSH, which gives me Alpine Linux on top of a virtual x86 machine as provided by a JIT layer. I can host PWA apps out of that environment for local use. Of course I can also SSH to my Unix-like machines from there too.
I am starting to tinker with Swift a bit more too. As with Google, I could buy a dev key to deploy my own apps; only this way I get all the window dressing and end-to-end encryption on cloud storage.
Doesn’t that require you to host it and have it available on the open web, though? Is there a host that allows you to serve, for free, not only HTML/CSS/JS but also arbitrary tools and bespoke scripts on the backend?
For free? No, but if you built a native app that needed a backend, you'd still need to host the backend somewhere too. I host my own web apps from a cheap mini PC at home and access them over Tailscale for personal use.
I'm pretty sure that if you build your PWA so it works offline through caching (which is easy if it's just a static website), you could host/serve it temporarily and just install it once.
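A minimal sketch of that cache-first approach, as a service worker for a static app. The cache name, asset list, and the `cacheFirst` helper are all placeholders/illustrations, not anyone's actual setup:

```javascript
// sw.js — offline-first ("cache, falling back to network") service worker.
const CACHE = 'my-pwa-v1';
const ASSETS = ['/', '/index.html', '/app.js', '/style.css'];

// Pure helper: prefer the cached response; otherwise hit the network.
// (caches.match resolves with undefined on a cache miss.)
function cacheFirst(cached, fetchFn) {
  return cached !== undefined ? Promise.resolve(cached) : fetchFn();
}

// Wiring; only runs inside a browser's service-worker context.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('install', (event) => {
    // Pre-cache the static assets so the app loads with no network at all.
    event.waitUntil(caches.open(CACHE).then((c) => c.addAll(ASSETS)));
  });
  self.addEventListener('fetch', (event) => {
    event.respondWith(
      caches.match(event.request).then((hit) => cacheFirst(hit, () => fetch(event.request)))
    );
  });
}
```

The page registers it once with `navigator.serviceWorker.register('/sw.js')`; after that, the precached files are served even when the original host is gone.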
As a lark, I built a set of personal productivity apps that are delivered as standalone local webpages. Works surprisingly well on Android, haven't tested on iOS.
I would like to mention that although I’m aware of the limitations, I think it is worth designing and advocating for web app standards that could at some point become a viable competitor to native apps, especially for apps that really don’t need to be native/wrapped in the first place, since most are CRUD apps anyway.
Maybe this will be a catalyst towards further evolution of the web app as Android devs want to carve out some freedom from the world domination corporate shadow government walled gardens.
You're not wrong, but it will always be the case that the web platform lags native. There will always be stuff you can't do without a native client. The proportion of apps that can viably run as a PWA will probably increase over time, but the platforms have both the ability and the incentive to stay ahead.
Most apps can be a PWA nowadays. A Hetzner VPS costs roughly the same as the Apple dev membership. Saying this as a native iOS dev since iOS 4. For your average pretty JSON printer you don’t need to go native.
Offline PWA sites are very limited on iOS. If you force close Safari, look at your phone funny, or don’t visit the site regularly, the cache is cleared and you are stuck at a loading screen until you have internet again.
That’s what forced me to finally bite the bullet and pay Apple yearly so I could develop an app for my friends and I to use. Would have much rather kept it as a PWA.
I remember running Kali Linux on my phone once (Termux + VNC, with a VNC viewer app) and watching some random YouTube videos, a few years back.
So I feel like something like this was/is possible, but it's immensely hard for something like this to get real use, especially when a desktop OS on a phone is so bad ergonomically unless you have a keyboard and mouse connected.
A better option, IIRC, is to use something like Kivy[0] directly with Termux; not sure whether Java has direct options too.
You aren't even limited to android apps. You can install termux and write and compile your own code to run from there or to copy and run anywhere else.
Sorry, even as a developer, "but, you can use ADB" is a big big copout.
What's the next step when ADB requires some hoops to enable? Will we say that since the eMMC has an unencrypted EXT4 partition, we can just desolder it and write to it?
As a dev, I'd say having to use ADB is a minor inconvenience.
Still unacceptable; a better option would be to use something like LineageOS or some other AOSP distro without the Google services (hoping that nothing makes you dependent on them).
This still doesn't address the vast majority of people though (and that's what I'm concerned about the most).
What we need now is:
- short term, work on pushing apps not to depend on Google services, so phones preinstalled with something like /e/ become a viable option for most people. Push our public services to stop mandating Google and Apple OSes for random stuff.
- longer term, work on making alternatives to Android and iOS viable options for most people (stability, usability and availability of services people use). The best candidate for that today is Linux mobile.
Breaking network effect around proprietary services is one of the strategies towards this.
Another one is reducing our reliance on computers (of any shape) altogether, maybe.
Technically not, but the devil is in the details. Having to reinstall the app every 7 days and a limit of one app doesn’t even pass the bare minimum.
Jolla has a prelaunch campaign, decent phones for €200. I might just as well grab one. Sick of having a phone which is more expensive than my laptop but which I can barely use.
Wait, I can download and run iOS on my own hardware? Not that I have tried, but I always thought Apple's whole schtick was that you were only allowed to run their software on the latest X revisions of their hardware?
Isn't keeping ADB enabled (most people who do this don't enable it and then promptly disable it) a huge security problem? ADB enabled means an adversary can completely own your device and "back it up" by simply plugging it in.
This is much worse than nagging about "untrusted sources".
Not only is it TOFU, but that comment is doubly wrong, because you can't really back up much other than the bulk storage directory without adb root (which requires a custom build, which obviates the issue to begin with).
Apple has the same thing, but for some reason added Developer Mode which you must enter on the iPhone first. It’s quite involved, with a restart and 3 confirmation dialogs. That had me wondering why they are suddenly so cautious around this.
>ADB enabled means an adversary can completely own your device and "back it up" by simply plugging it in.
Each ADB host has to be individually whitelisted by an unlocked device. Also, the current behavior is that it auto-forgets any whitelisted host that hasn't connected within 7 days.
My personal perspective: 2 out of 3 MacBook Pros I worked with had swelling batteries after about 5 years. Replacement was a big hassle, and the new no-name batteries are nowhere near as good as the originals.
I sure wish it was as easy as a battery replacement on a Framework laptop (with an original part).
I know the Neo has easier battery replacement (not glued in), but still it has an iFixit rating of 6/10 whereas the Framework 12 has a 10/10.
> We have people who can still do maths well after the introduction of the calculator.
I assume by "do maths" you mean doing simple calculations, like adding a bunch of small numbers, in one's head. That's because in many situations it's more convenient to do so than to use a calculator. So the skill is preserved and practiced precisely because a calculator is too cumbersome to use. Most people's skills settle at the equilibrium where taking out the calculator and focusing on typing costs the same effort as straining the brain to do it without one.
> We have people who can still spell after the introduction of spell check.
When using spell check to fix your document, you automatically learn to spell. Your skills improve by using the tool. A better analogy to AI would be an email client with a "Fix all and send"-button, where you never look at the output of the spell checker.
No. These tools are very good at creating the illusion of learning without any learning. When you watch them do stuff, you think: yeah, I've got this.
Once they are gone, you realize all your supposed skill is gone too. Getting a skill requires deliberate practice. You can use AI for that, but just using AI is not that.
There's an old Latin proverb "Scribere bis legere", which translates to "writing is reading twice".
In practice, what this means is that you can read some subject many times, but you would still struggle to reproduce the content by yourself. That is why, when learning, it is not sufficient to just read the material several times.
I would also argue that most school systems forbid the use of a calculator for the first couple of years (at least that's how it was in Germany a few decades ago). The same with writing by hand. You can spell check by looking the word up and then manually correcting it.
Both require manual "labor", which leads to learning.
And calculators took decades to become widespread. So we could learn of their side effects before they became mainstream.
Also worth noting: calculators merely solve intermediary steps. LLMs are increasingly designed to do full-blown work in one shot: longer context, deep thinking, agentic loops.
> > We have people who can still spell after the introduction of spell check.
> When using spell check to fix your document, you automatically learn to spell. Your skills improve by using the tool. A better analogy to AI would be an email client with a "Fix all and send"-button, where you never look at the output of the spell checker.
I was in high school right as spell checkers were becoming common, and the general consensus among us as students was that they made us worse for exactly that reason: we could just click the spell check button and "accept all", so most of us stopped learning the right way to spell words we had trouble with.
My observation is that there's a certain group of programmers who "don't get graphical CAD" (their words) and prefer code based CAD. Now with LLMs, many of them like to spin up their own solution of "text to CAD", "image to CAD" or just another "code based CAD" and then do a "Show HN", praising it as the bee's knees.
When I look at their examples, the objects are usually non-functional, lacking precision and control, and could much more easily be done in a graphical CAD application.
I think some people just don't want to learn graphical CAD and hence are unable to see where their own solution is lacking important features.
It doesn't feel like more ideas are explored, it feels like more variants of the same old things are produced. Ideas have always been hard and AI doesn't help with that.
It feels like people are more willing to give their agent a prompt than search the web for existing solutions.
I've noticed a crazy amount of clearly AI coded projects that do a small subset of an already existing and very trusted open source project. Comments usually point this out, and the OP never responds. I'm not sure what the end goal is, but the whole thing feels like a waste of time for everybody involved.