Brianna Wu / John Walker Flynt - "Biggest Victim of Gamergate," Failed Game Developer, Failed Congressional Candidate

I wish I understood this endian stuff. I hate getting left out of laughing at John. But alas, all my degrees are in the humanities, and so I remain mechanically declined.

At least I have degrees, though, so I'm one up on John. Hurray!
 
Okay. I know just enough about computers that I can build my own and set up the bios.

But...

This isn't about Gulliver's Travels, right? I fail to see what she's actually talking about because all I'm reading keeps sounding like Lilliput...
 
Long story short for the non-CS people (also the opportunity to let my inner sperg out for a bit):

Big endian has the most significant byte sent first, so it goes at the beginning: x_______. You can think of this as the "normal" way of reading numbers, where the highest value is on the left.

Little endian has the least significant byte sent first, so the most significant byte ends up at the end: _______x. This looks backwards to the way we "normally" read numbers. (And note it's byte order, not bit order, that endianness describes.)

Different CPU families do it different ways, but since x86 (Intel and AMD) dominates and x86 is little endian, little endian is more or less the default now. ARM chips run little endian by default too.

There's also middle endian but you really don't need to worry about that.


The joke about Wu's second tweet is that she said Intel went from little endian in 386 chips to big endian in 486 chips, which would basically be impossible if you wanted to maintain backwards compatibility*, so she has no idea what she's talking about.


*Edit: Changing endianness reverses the byte order of anything bigger than one byte. So if software wrote the 32-bit value 0x12345678 to memory and you then changed endianness, the computer would read those same bytes back as 0x78563412, which is a completely different number, and nothing that used to work does anymore.
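If you want to see the byte-order difference for yourself, here's a minimal sketch using Python's standard-library `struct` module, which lets you pack the same integer both ways:

```python
import struct

value = 0x12345678  # a 32-bit example value

# Pack the same integer with each byte order.
big = struct.pack(">I", value)     # big-endian: most significant byte first
little = struct.pack("<I", value)  # little-endian: least significant byte first

print(big.hex())     # 12345678
print(little.hex())  # 78563412

# Reading little-endian bytes as if they were big-endian gives a totally
# different number, which is why a CPU line can't silently flip byte order
# and keep backwards compatibility.
misread = struct.unpack(">I", little)[0]
print(hex(misread))  # 0x78563412
```

Running this on any machine gives the same packed bytes, since `struct` does the ordering itself regardless of the host CPU.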
 

You have to remember there literally isn't a single subject John actually understands. He couldn't even pass a moron major like journalism in ten years.

Anything he says on any technical subject is a mishmash of misremembered nonsense from some thing he skimmed once years ago and never really understood in the first place. You'd think someone who made an utter fool of himself every time he discussed a subject would learn at some point, but not our John.
 

That makes sense, but why call it something from a satirical novel from the 18th century?
 
I've been trying to figure out what movie character this picture reminds me of
EbYeelh.gif

I'm pretty sure there's an orc in LotR at one point that looks almost exactly like this, but I can't remember when that one is.
 
That makes sense, but why call it something from a satirical novel from the 18th century?
That part I'll shamelessly copy from Wikipedia.

Danny Cohen introduced the terms Little-Endian and Big-Endian for byte ordering in an article from 1980.[1][2] In this technical and political examination of byte ordering issues, the "endian" names were drawn from Jonathan Swift's 1726 satire, Gulliver's Travels, in which civil war erupts over whether the big end or the little end of a boiled egg is the proper end to crack open (analogous to counting from the end that contains the most significant bit or the least significant bit).[3][4]
 
Why is she like this? Why is it so important for her to pretend to all this software engineering knowledge she clearly has no fucking clue about?
Because Brianna Wu, the character, can't ever be wrong.

Like many other trannies, Wu has formed the basis for her gender identity off of some fantastical, fetishized vision of womanhood (in John's case, it's his SOCCON characters). Basically, Brianna wants to be a real life Mary Sue.
 
Actually I think you being kicked out on your asses is far more believable than you having a Congressional campaign.

rent.jpg
 

What surprises me is that Flynt/Wu hasn't made a big push for everyone to start using bi-endian compilers. The big and little endians are literally killing bi-endians.
 