Hacker News | teleforce's comments

Fun fact: Gibraltar was named after Tariq ibn Ziyad, a famous Muslim Berber commander of the Umayyad Caliphate who conquered most of Spain and parts of French territory in the early 8th century CE [1].

Then, after the conquest, came the exiled young Umayyad prince (escaping from the later Abbasid Caliphate), who settled in Spain and created a Muslim Spanish empire that lasted around 800 years (longer than Europeans have lived in America), with its knowledge center in Toledo. This center held many translated books and also many new books by Muslim scholars. A famous example is the Arabic translation of the Almagest, which was copied, translated further into Latin, and studied by Copernicus and Galileo [2]. Of course there are other Muslim astronomy books and ideas that Copernicus and Galileo studied and copied but never cited properly [3].

Another famous book is the Muqaddimah by Ibn Khaldun, widely considered the very first work dealing with the social sciences of sociology, demography and cultural history [4].

This center was later captured in the 11th century CE, an event that essentially started the Western Renaissance movement in Europe.

Legend has it that, in order to motivate his troops, Tariq ordered his entire armada of ships scuttled before advancing into Spain [5]. Perhaps some of the sunken ships are part of Tariq's original armada; those ships were sunk intentionally, not by accident.

His act of bravery was copied by later Spanish conquerors, but as usual it has not been properly credited to Tariq's original effort [6].

[1] Tariq ibn Ziyad:

https://en.wikipedia.org/wiki/Tariq_ibn_Ziyad

[2] Galileo's handwritten notes found in ancient astronomy text (42 comments):

https://news.ycombinator.com/item?id=47263938

[3] Islamic Astronomy and Copernicus [pdf]:

https://www.tuba.gov.tr/files/yayinlar/bilim-ve-dusun/TUBA-9...

[4] Muqaddimah of Ibn Khaldun:

https://en.wikipedia.org/wiki/Muqaddimah

[5] The Legend of Tariq ibn Ziyad and the Burning of Ships:

https://arabic-for-nerds.com/islam/conquest-andalus/

[6] Richard A. Luecke - Scuttle Your Ships before Advancing: And Other Lessons from History.


> 800 years (that's more than European living in America now) muslim Spanish empire with its knowledge center in Toledo

The Muslim dominion of the Iberian Peninsula did not last 800 years. The Muslim invasion started in 711 CE, and by 1085 Toledo had fallen back to the Christian Kingdom of León. Granada would eventually be conquered in 1492, but most of the old Visigothic Kingdom was already in Christian hands.


711 AD to 1492 AD is a good 781 years.

But as I said above, that is only true for Granada, not for the rest of what would become Spain.

> This center was later captured in 11th century CE, and this event essentially started the Western Renaissance movement in Europe.

The Islamic contribution within the context of European history should be both acknowledged and recognized as autochthonous, but attributing to it things that are well attested through other pathways works against it and reinforces myths historians are toiling to get rid of.

The Renaissance as we know it was kickstarted by the conquest of Constantinople in 1204 by the French and Italians; that's well documented and broadly agreed on by historians. All of this happened on the foundations laid down from the 11th century onwards, as the post-Carolingian world stabilized.


It's not like Tariq ibn Ziyad invented the concept of intentionally making a retreat impossible in order to compel soldiers to fight. There are proverbs about this kind of thing that predate him by centuries: https://en.wiktionary.org/wiki/%E7%A0%B4%E9%87%9C%E6%B2%89%E... It's probably a popular story to tell because it raises the stakes and provides for dramatic tension: either the battle is won or the army will be annihilated. But I suspect there've been quite a few unlucky commanders who tried this, got annihilated, and never had their heroism praised in history books.

You can use this technique during job interviews by bringing your own padlock and employment contract, and a rope just in case.

Why a pad lock and rope? Are you hoping to lock yourself in?

>Have you ever daydreamed about talking to someone from the past?

Fun fact: the LLM was once envisioned by Steve Jobs in one of his interviews [1].

Essentially, one of his main wishes in life was to meet and interact with Aristotle, which, according to him at the time, computers in the future could make possible.

[1] In 1985 Steve Jobs described a machine that would help people get answers from Aristotle–modern LLM [video]:

https://youtu.be/yolkEfuUaGs


The idea of talking to a machine that has all of humanity's knowledge and gives answers is older than electronic computing. It certainly wasn't a novel idea when Jobs gave that speech. At that time, the field of artificial intelligence was old enough to become US president.

Also, using natural language to interact with digital computers has been a research goal since the advent of interactive digital computers. AI in the 80s tried to do this with expert systems.

With the current crop of LLMs, you could argue it's now a solved problem, but the problem is nothing new.


Solved in the sense that the core idea has been realized but unsolved in the sense that it isn't the sort of safe, reliable, deterministic interaction that was commonly envisioned.

>Aristotle

As a snake oil seller, heh, I wouldn't expect anything better from Jobs. A competent and true programmer/hacker like Knuth and the like would rather want to talk with Archimedes (he almost did a 0.9 version of calculus) or Euclid of the Elements, both far more relevant than Aristotle's faulty logic and quackery.


Except... not at all? The vast majority of the training data required to create an artificial Aristotle has been lost forever. Smash your coffee cup on the ground. Now reassemble it and put the coffee back in. Once you can repeatably do that I'll begin to believe you can train an artificial Aristotle.

Also, none of Aristotle's exoteric works are extant. All we have are dry, boring lecture notes. Cicero said his public works were a "golden stream of speech", and it's all lost. So I don't see how you'd build an artificial Aristotle when none of his polished works meant for the public survive. Plato would be a better option, since his entire exoteric corpus is extant.

Your bar is too low. With the coffee cup, you at least have access to all the pieces - in theory, although not in engineering practice. With Aristotle, you don't have anything close to that.

Recreating Aristotle in any meaningful way, other than a model trained on his surviving writing of a million or so words, is simply not possible even in principle.


That's easy! All you have to do is simulate the whole universe on a computer, and then go to the point when Aristotle is lecturing. Record all his works, then ctrl-c out of that and feed those recordings into the LLM's training data. For the coffee, you just rewind the simulation and ctrl-c and ctrl-v it at the point you want.

> simulate the whole universe on a computer

Of course in principle that computer only has to be 1.x times larger than the universe, where x > 0. Perhaps AWS can sell you the compute.


Fuck why didn't I think of that all those other times I fucked up in my life. Ctrl-z woulda done it every goddamn time.

OK I'll raise the bar--make sure when you reassemble the coffee cup and put the coffee back into it, the coffee is the exact same temperature as when you threw the whole shooting match onto the floor ;)

EDIT: and you don't get to re-heat it.

EDIT AGAIN: to be clear, in my post above (and this one) by "put the coffee back in" I meant more precisely "put every molecule of coffee that splashed/sloshed/flowed/whatever out when the cup smashed back into the re-assembled cup" i.e. "restore the system back to the initial state". Not "refill the glued-together pieces of your shattered coffee cup with new coffee".


Ah ok sorry, so you want them to fully reverse entropy. I agree that bar is high enough.

Yeah I think if you could pull off a trick like that you could probably recover the necessary training data ;)

Imagine aiming for Aristotle and landing on Siri…

> I see hardware as being a thing for the second world and unlikely to stage a big comeback.

I cannot disagree more.

Actually, the synergy of software and hardware (primarily due to the increasing popularity of sensing across the electromagnetic (EM) spectrum, like radar/LIDAR/mmWave/THz/etc., compared to sound) will create perception and intelligence beyond human capability, embodied and enhanced by physical AI. Heck, EXG sensing, including ECG/EMG/EEG/etc., which is technically part of EM, is now generating hundreds of papers/patents/articles every day, of which this product/patent/paper by Meta and its subsidiary CTRL-labs is only the tip of the iceberg [1],[2].

Please check my other comments for more context.

[1] A generic non-invasive neuromotor interface for human-computer interaction (Nature article):

https://www.nature.com/articles/s41586-025-09255-w

[2] Meta Ray-Ban Display (2025 - 962 comments):

https://news.ycombinator.com/item?id=45283306


Not to mention the various manufacturing nationalisation initiatives by the USA, EU, etc. And while it's a scant hope after Covid, maybe American investment culture will calm down and software engineering will cease to be so overvalued.

These are my recent comments on the new RF System-on-Module (SoM) assemblies [1].

If you want to venture or pivot into RF, especially from a software background, this is the golden time, made possible/feasible by software-defined radio (SDR) technology, as mentioned in the OP article.

One very important thing the article did not mention is the emerging and increasing popularity of physical AI [2]. RF can be the crucial enabler to further enhance humans' limited sensing capabilities with EM-based waveforms. A simple analogy is how a dog's powerful sense of smell enhances human detection capability.

Rather than just training and inferencing on image-based I/O, physical AI can now feed on the much richer raw RF, mmWave, THz and LIDAR waveforms. The good news is that the processing of the latter (mmWave, THz and LIDAR) can be greatly enhanced by the former's lower RF baseband (the modulated information signals), which was not previously possible/feasible.

[1] Comments on "ADSY1100-Series: RF System-on-Module Assemblies":

https://news.ycombinator.com/item?id=47821336

[2] What is physical AI?

https://www.ibm.com/think/topics/physical-ai


I have the BTwin Ultra Compact by Decathlon and I'd recommend it as an alternative to the popular Bromptons [1].

It costs less than half as much as the equivalent Brompton bike featured in the article.

[1] BTwin Ultra Compact 1 Second Light:

https://road.cc/content/review/btwin-ultra-compact-1-second-...


>Digging a ditch strengthened the forest cover for his flanks

>The Mughal position was again fortified with a ditch and wagons linked by chains and the matchlockmen, placed in the front of the force, ‘broke the ranks of the pagan army with matchlocks and guns like their hearts’; they were black and covered with smoke. The Mughals had only about 12,000 troops at Kanua, whereas the Rajputs, allegedly, had 80,000 cavalry and 500 elephants

Digging a ditch during battle is a typical, signature Persian war technique.

Not to be pedantic, but the more correct word here is probably trench. The trench is called khandaq in Persian and Arabic, the latter most probably a word borrowed from the former.

The main idea is to pre-emptively dig a trench before a battle, just deep enough to prevent the enemy's cavalry horses from jumping across.

It was successfully used by the early Islamic force against the much larger Meccan Quraysh army and their allies during the famous Khandaq war, in defense of Yathrib (now Madinah) [1]. The idea was suggested by Salman al-Farisi, a Persian companion of Muhammad [2].

Fun fact: Mughal palace households mainly spoke Persian, and the Hindi/Urdu language is heavily influenced by Persian, more so than by Arabic.

[1] Battle of the Trench:

https://en.wikipedia.org/wiki/Battle_of_the_Trench

[2] Salman the Persian

https://en.wikipedia.org/wiki/Salman_the_Persian


> Mughals palace households were mainly speaking Persian language

They actually spoke Chagatai Turkish in the households, and Persian was the court language. Urdu developed independently, mostly outside royal patronage, and much later.


Thanks for the wonderful Wikipedia excursion I just enjoyed, I learned a lot.

Just wondering why you considered DAC cables cheating, is the analog magic mainly the impedance matching or I'm missing something?

DAC cables are cheating because, thanks to the extremely short range limits (5 m, 7 m if you're very lucky), they can just put the 10GBASE-R/SFI signal straight onto a pair of copper wires at 10.3125 GBd.

10GBASE-T, to try to get to 100 m, throws FEC on it and converts the signal to 4x PAM-16/THP at 800 MBd, and then uses four copper pairs *bidirectionally*. That's the analog magic.
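The back-of-the-envelope rate arithmetic behind those two approaches can be sketched in a few lines (a rough illustration; the real 10GBASE-T line coding layers LDPC and 128-DSQ on top of PAM-16, which this simplifies away):

```python
# 10GBASE-R (what DAC/SFP+ carries): serial NRZ with 64b/66b line coding,
# so 10 Gb/s of payload needs 10 * 66/64 = 10.3125 GBd on one pair.
payload_gbps = 10.0
line_rate_gbd = payload_gbps * 66 / 64
assert abs(line_rate_gbd - 10.3125) < 1e-9

# 10GBASE-T: 4 pairs, each at 800 MBd, PAM-16 carrying 4 raw bits/symbol,
# used bidirectionally. Raw capacity exceeds 10 Gb/s; the surplus pays
# for FEC and framing overhead.
pairs, baud_gbd, bits_per_symbol = 4, 0.8, 4
raw_gbps = pairs * baud_gbd * bits_per_symbol
print(raw_gbps)  # 12.8 Gb/s raw, vs. 10 Gb/s of payload
```

The contrast is the point: the DAC path runs one very fast simple signal, while 10GBASE-T trades symbol rate for per-symbol complexity to survive 100 m of twisted pair.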


>I'm personally of the opinion that "graph databases" should be relational databases; the relational model can subsume "graph" queries

The MIT team, which is also part of the team that proposed GraphQL, already kind of solved this problem with D4M [1].

Essentially, D4M is able to universally represent relational tables, graphs, spreadsheets and matrices using the mathematics of associative arrays; they even wrote an entire book on the subject [2].

It's beyond me why they didn't push D4M forward but instead proposed the more limited GraphQL. It seems they're restricting D4M's capability and focusing on a query language like GraphQL, perhaps due to industry demand and bias.

Heck, the same team also proposed TabulaROSA, a new database OS, as a more efficient alternative to conventional file-system-based OSes like Unix/Linux [3].
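To illustrate the core idea (a toy Python sketch, not the real D4M API): a single associative-array structure can be read as a table, a sparse matrix, or a graph edge list, and matrix multiplication then doubles as graph traversal:

```python
class Assoc:
    """Toy associative array: (row key, column key) -> value."""
    def __init__(self):
        self.data = {}
    def set(self, r, c, v):
        self.data[(r, c)] = v
    def get(self, r, c, default=0):
        return self.data.get((r, c), default)
    def transpose(self):
        t = Assoc()
        for (r, c), v in self.data.items():
            t.set(c, r, v)
        return t
    def matmul(self, other):
        # Sparse matrix product: composing two relations,
        # i.e. 2-hop reachability when the arrays are adjacency matrices.
        out = Assoc()
        for (r, k1), v1 in self.data.items():
            for (k2, c), v2 in other.data.items():
                if k1 == k2:
                    out.set(r, c, out.get(r, c) + v1 * v2)
        return out

# The same structure read three ways: a table row, a matrix entry,
# or a directed graph edge.
A = Assoc()
A.set("alice", "bob", 1)     # edge alice -> bob
A.set("bob", "carol", 1)     # edge bob -> carol
two_hop = A.matmul(A)        # matrix product = 2-hop paths
print(two_hop.get("alice", "carol"))  # 1
```

This is the "graph queries are linear algebra over associative arrays" view that the D4M book develops at length.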

[1] D4M: Dynamic Distributed Dimensional Data Model:

https://d4m.mit.edu/

[2] Mathematics of Big Data: Spreadsheets, Databases, Matrices, and Graphs:

https://mitpress.mit.edu/9780262038393/mathematics-of-big-da...

[3] TabulaROSA: tabular operating system architecture for massively parallel heterogeneous compute engines:

https://www.ll.mit.edu/r-d/publications/tabularosa-tabular-o...


As mentioned in the article, these devices can "help designers of advanced electronic systems in communications, electronic test and measurement, electronic warfare (EW) and radar systems to speed these devices to the production line".

More specifically, these devices can revolutionize the processing and control of electromagnetic (EM) wave signals. For example, at RF and microwave, the devices' operating frequency can cater for both the baseband frequency (the signals containing information) and the carrier frequency (higher-energy signals that carry the lower-energy information signals). For RF and microwave, the baseband signals are in the MHz range or lower, while the carrier signals are in the GHz range [1],[2]. RF and microwave are the basis of wireless communication (Wi-Fi and 4G/5G) and radar (weather and military) technology.
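The baseband/carrier relationship above can be shown in a tiny NumPy sketch (illustrative values only: a 1 MHz information tone up-converted onto a 1 GHz carrier):

```python
import numpy as np

fs = 8e9                         # sample rate, Hz (chosen for illustration)
t = np.arange(0, 1e-6, 1 / fs)   # 1 microsecond of samples
baseband = np.cos(2 * np.pi * 1e6 * t)  # information signal, MHz range
carrier = np.cos(2 * np.pi * 1e9 * t)   # carrier, GHz range
rf = baseband * carrier          # mixing shifts baseband to 1 GHz +/- 1 MHz

# The transmitted energy sits near the carrier, not at baseband:
spectrum = np.abs(np.fft.rfft(rf))
freqs = np.fft.rfftfreq(len(rf), 1 / fs)
print(freqs[np.argmax(spectrum)])  # peak near 1 GHz
```

Mixing is just multiplication in the time domain, which is why the information (MHz-wide) rides at GHz and why a converter fast enough to digitize the carrier directly is such a big deal.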

For higher-frequency signals in millimeter waves or mmWave (30 GHz - 300 GHz) [3], these devices can directly process and control the baseband signal components, but not (yet) the carrier frequency. The new wireless 5G standard (FR2) utilizes mmWave to increase the baseband information bandwidth, but mostly operates at shorter range in urban environments, for example in micro base stations. Although these devices cannot cater for the higher carrier frequencies of mmWave, they can process and control its relatively lower baseband signals in the GHz range. The same applies to THz waves (0.1 THz - 10 THz) [4], where the baseband signal is on the order of GHz, and to photonics-based waves in the infrared and visible light spectrum (30 - 750 THz) [5].

If you're wondering why this is game-changing: in the analog domain it's very difficult to reliably process and control kHz signals, and basically impossible for MHz (10^3 times higher than kHz) and GHz (10^6 times higher) signals. The processing and control of baseband signals is crucial, since they carry the information (entropy) part of the signals.

With proper processing and control many things become possible; one very fundamental and important example is impedance matching (part of my research thesis). In the analog domain it's impractical to provide wide-bandwidth impedance matching due to the Fano limit (a physics-based limitation) [6]. However, using these devices we can artificially overcome this limitation by performing the process in the digital domain, controlling the baseband information over a very wide bandwidth (up to GHz). This bandwidth is limited only by the performance of state-of-the-art digital and mixed-signal electronics, as provided by devices like the ADSY1100 (or more precisely its main engine, the AD9084 Apollo MxFE 20 GSPS signal converter).
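To make the Fano (Bode-Fano) limit concrete, here is a rough numeric sketch for the classic parallel-RC load case; the R, C and reflection-coefficient values are made-up illustrative numbers, not taken from any particular device:

```python
import math

# Bode-Fano bound for a parallel RC load:
#   integral of ln(1/|Gamma(w)|) dw  <=  pi / (R*C)
# For an ideal brick-wall match holding |Gamma| = gamma over a band B:
#   ln(1/gamma) * B_omega = pi / (R*C)
R = 50.0      # ohms (assumed load resistance)
C = 1e-12     # farads (assumed shunt capacitance)
gamma = 0.1   # desired worst-case in-band reflection magnitude

B_omega = math.pi / (R * C * math.log(1 / gamma))  # rad/s
B_hz = B_omega / (2 * math.pi)
print(f"max matchable bandwidth ~ {B_hz / 1e9:.2f} GHz")
```

The bound shows the trade directly: demanding a better match (smaller gamma) shrinks the achievable analog bandwidth, which is the limitation the digital-domain approach described above tries to sidestep.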

Impedance matching may be the most important or killer application, but new capabilities in processing these EM waves can also lead to the discovery of novel applications, for example high-accuracy detection of land mines, cancer, or earthquake victims under rubble [7],[8].

[1] Radio spectrum (RF):

https://en.wikipedia.org/wiki/Radio_spectrum

[2] Microwave:

https://en.wikipedia.org/wiki/Microwave

[3] Extremely high frequency (millimeter waves):

https://en.wikipedia.org/wiki/Extremely_high_frequency

[4] Terahertz radiation (decimillimetric waves):

https://en.wikipedia.org/wiki/Terahertz_radiation

[5] Photonics:

https://en.wikipedia.org/wiki/Photonics

[6] Fano limits on matching bandwidth:

https://ieeexplore.ieee.org/document/1532554

[7] Scientists invent new way to detect skin cancer:

https://www.bbc.com/news/articles/c9wzj1m3g4no

[8] At least 15 still alive under Bangkok skyscraper rubble, rescuers say:

https://www.bbc.com/news/articles/c4gpgylq0qno


The latest (3rd) edition of Introduction to Compiler Design by Mogensen adds new material on SSA form, garbage collection, polymorphism and the translation of pattern matching [1].

[1] Introduction to Compiler Design:

https://link.springer.com/book/10.1007/978-3-031-46460-7

