Discussion:
[Link Posting] CIA archives document Agency's decades of ASCII woes
Rich
2018-07-07 19:41:09 UTC
Reply
Permalink
Raw Message
####################################################################
# ATTENTION: This post is a reference to a website. The poster of #
# this Usenet article is not the author of the referenced website. #
####################################################################

<URL:https://www.muckrock.com/news/archives/2018/jul/03/cia-ascii/>
In the '60s, the US federal government saw a need for a unified standard
for digitally encoding information. Lyndon Johnson's 1968 executive
order on computer standards directed federal agencies to convert all of
their databases to the new character encoding standard: the American
Standard Code for Information Interchange, or ASCII.
Although more powerful and flexible standards have since appeared - most
notably Unicode, created to enable people to use computers in any
language - ASCII became ubiquitous, and remains foundational to
computing. It was the most popular encoding on the web until 2007.
The new requirement applied to all federal agencies, including the
Central Intelligence Agency. At first the Agency had no objections. In a
November 1965 letter to the Secretary of Commerce uncovered in CREST,
Director William Raborn signalled the CIA's support of the
standardization effort.
"The basic idea is a good one," wrote Raborn with a hint of reserve,
foreshadowing the decades to come.
Three years later in 1968, when Johnson's executive order went into
effect, the bureaucratic wheels started grinding.
One remarkable memo, written only six months after the executive order,
provides an early example of the way technically-skilled people engage
in institutional power struggles in the era of computers. In it, an
agency employee assesses an ASCII-related proposal by the National
Bureau of Standards, and finds it wanting. null
"Working technicians," the memo's author continues, "would not take the
standard seriously except through excessive and unwise coersion (sic)."
Twenty years later ASCII was still a source of grief. In October 1987,
the CIA's Foreign Broadcast Information Service office in Key West
reported on their role as "guinea pig" for the transition to ASCII - "a
long and frustrating month for everyone involved."
...
RS Wood
2018-07-08 18:53:52 UTC
In the '60s, the US federal government saw a need for a unified standard
for digitally encoding information. Lyndon Johnson's 1968 executive
order on computer standards directed federal agencies to convert all of
their databases to the new character encoding standard: the American
Standard Code for Information Interchange, or ASCII.
A maddeningly vague article, though interesting. Didn't find much that said
specifically what their issues were. Interestingly, they mention having to
revert to Baudot, which I hadn't heard of. Here it is:

https://en.wikipedia.org/wiki/Baudot_code

Doesn't seem much different than any other code page, unless I'm missing
some nuance. It does give you a sense of the age/vintage of technologies
that ASCII and friends began replacing. Pre-teletype, wow.

Unicode gets my vote for one of the most useful things we've implemented in
the last 20 years. My modern computer is bigger and faster than my old 32
bit, 128 MB RAM, 4GB hard drive machine, yet I use it for fundamentally the
same tasks. But unicode solves a lot of problems I used to come across
writing and reading multiple languages in the early aughts.
Andy Burns
2018-07-09 06:41:05 UTC
Post by RS Wood
they mention having to revert to Baudot, which I hadn't heard of. Here it is:
https://en.wikipedia.org/wiki/Baudot_code
Doesn't seem much different than any other code page
Well there's the figure-shift/letter-shift codes that swap between the
two 'halves' of the character set for a start
Richard Kettlewell
2018-07-09 08:06:07 UTC
Post by RS Wood
In the '60s, the US federal government saw a need for a unified standard
for digitally encoding information. Lyndon Johnson's 1968 executive
order on computer standards directed federal agencies to convert all of
their databases to the new character encoding standard: the American
Standard Code for Information Interchange, or ASCII.
A maddeningly vague article, though interesting. Didn't find much
that said specifically what their issues were.
There are some specifics in the memo, although indeed it’s not clear
exactly how they screwed up the Baudot->ASCII translation in 1988.

From the 1968-09-27 memo:
#2-5 is essentially complaining about the use of a 7-bit encoding when
they have lots of 8-bit data to move around. That was still an issue
into at least the 1990s.

From the 1968-09-25 memo:

#1-#3 complains that most of their computers use EBCDIC (and #7-#8
predict that this will continue to be true, which turned out to be
wrong) and they don’t see the point converting EBCDIC->ASCII->EBCDIC
for interchange.

#4 complains that there is no standard way to represent 7-bit ASCII in
an 8-bit word. I struggle to see why that’s an issue, really; there’s
a natural mapping from the ASCII codes to integers and presumably they
were already dealing with 8-bit integers routinely. I think this is
just foot-dragging :-) #5 is the same complaint for storage devices.
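There was a genuine ambiguity lurking in #4, though: with a 7-bit code in an 8-bit word, the spare eighth bit was variously left zero, forced to one ("mark parity"), or used as a parity check, so the same character could appear as different byte values on different systems. A sketch of the three conventions:

```python
# Three common ways a 7-bit ASCII character was stored in an 8-bit word.
def with_zero_bit(c):      # high bit left clear
    return ord(c) & 0x7F

def with_mark_bit(c):      # high bit forced to one ("mark parity")
    return ord(c) | 0x80

def with_even_parity(c):   # high bit makes the total count of one-bits even
    v = ord(c) & 0x7F
    return v | (0x80 if bin(v).count('1') % 2 else 0)

c = 'A'  # ASCII 0x41 has two one-bits, so even parity leaves the top bit clear
print(hex(with_zero_bit(c)), hex(with_mark_bit(c)), hex(with_even_parity(c)))
# 0x41 0xc1 0x41
```

Software that compared raw bytes instead of masking to 7 bits would see those as three different characters, which is roughly the interoperability worry the memo's author was gesturing at.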

#6 complains that they have to rewrite sorting/comparison routines
that previously assumed EBCDIC. That seems like a legitimate gripe,
although given how weird EBCDIC is they’d probably find the result
simpler than the original...
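The #6 gripe is easy to demonstrate: byte-wise collation differs between the two encodings. In EBCDIC the digits sit at the top of the range (0xF0-0xF9) and lowercase sorts below uppercase, so letters sort before digits; in ASCII it is the reverse. A sketch using Python's cp037 codec, one common EBCDIC variant:

```python
# Byte-wise sort order under ASCII vs one EBCDIC code page (cp037).
data = ["B1", "1B", "b1"]

ascii_order = sorted(data)                                  # ASCII byte order
ebcdic_order = sorted(data, key=lambda s: s.encode("cp037"))  # EBCDIC byte order

print(ascii_order)   # ['1B', 'B1', 'b1']  -- digits first, lowercase last
print(ebcdic_order)  # ['b1', 'B1', '1B']  -- lowercase first, digits last
```

Any sort/merge routine or range check that baked in one ordering produces differently ordered output under the other, which is exactly the rewrite the memo was complaining about.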
Post by RS Wood
Unicode gets my vote for one of the most useful things we've
implemented in the last 20 years. My modern computer is bigger and
faster than my old 32 bit, 128 MB RAM, 4GB hard drive machine, yet I
use it for fundamentally the same tasks. But unicode solves a lot of
problems I used to come across writing and reading multiple languages
in the early aughts.
Yes.
--
https://www.greenend.org.uk/rjk/
Huge
2018-07-09 09:37:27 UTC
Post by Richard Kettlewell
#1-#3 complains that most of their computers use EBCDIC (and #7-#8
predict that this will continue to be true, which turned out to be
wrong) and they don’t see the point converting EBCDIC->ASCII->EBCDIC
for interchange.
Not that you can. There are characters in the EBCDIC set that aren't in
ASCII.
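A quick sketch of that lossiness, using cp037 as the EBCDIC variant: the cent sign and the not sign are the classic pair of EBCDIC code points with no 7-bit ASCII equivalent at all.

```python
# Some EBCDIC (cp037) characters simply do not exist in 7-bit ASCII,
# so an EBCDIC -> ASCII -> EBCDIC round trip cannot be lossless.
for ch in "\u00a2\u00ac":  # cent sign, not sign
    ebcdic = ch.encode("cp037")
    print(ch, hex(ebcdic[0]), ch.encode("ascii", errors="replace"))
# prints 0x4a and 0x5f for the EBCDIC bytes; both map to b'?' in ASCII
```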
--
Today is Setting Orange, the 44th day of Confusion in the YOLD 3184
~ Stercus accidit ~
Huge
2018-07-09 09:36:04 UTC
Post by RS Wood
In the '60s, the US federal government saw a need for a unified standard
for digitally encoding information. Lyndon Johnson's 1968 executive
order on computer standards directed federal agencies to convert all of
their databases to the new character encoding standard: the American
Standard Code for Information Interchange, or ASCII.
A maddeningly vague article, though interesting. Didn't find much that said
specifically what their issues were. Interestingly, they mention having to
revert to Baudot, which I hadn't heard of.
You've obviously never been a radio ham or involved with the early
days of Telex, especially SITOR (something something teleprinter over
radio)
Post by RS Wood
https://en.wikipedia.org/wiki/Baudot_code
Doesn't seem much different than any other code page, unless I'm missing
some nuance.
Baudot is only 5 bit ...
--
Today is Setting Orange, the 44th day of Confusion in the YOLD 3184
~ Stercus accidit ~
RS Wood
2018-07-10 00:06:34 UTC
On 9 Jul 2018 09:36:04 GMT
Post by Huge
Post by RS Wood
A maddeningly vague article, though interesting. Didn't find much that said
specifically what their issues were. Interestingly, they mention having to
revert to Baudot, which I hadn't heard of.
You've obviously never been a radio ham or involved with the early
days of Telex, especially SITOR (something something teleprinter over
radio)
... resisting .. the ... urge .. to crosspost to uk.radio.amateur ..... Never!
Huge
2018-07-10 08:37:49 UTC
Post by RS Wood
On 9 Jul 2018 09:36:04 GMT
Post by Huge
Post by RS Wood
A maddeningly vague article, though interesting. Didn't find much that said
specifically what their issues were. Interestingly, they mention having to
revert to Baudot, which I hadn't heard of.
You've obviously never been a radio ham or involved with the early
days of Telex, especially SITOR (something something teleprinter over
radio)
... resisting .. the ... urge .. to crosspost to uk.radio.amateur ..... Never!
Please don't. For one thing, I killfile crossposts to the cesspit.
--
Today is Sweetmorn, the 45th day of Confusion in the YOLD 3184
~ Stercus accidit ~
Rich
2018-07-09 11:20:24 UTC
Post by RS Wood
In the '60s, the US federal government saw a need for a unified standard
for digitally encoding information. Lyndon Johnson's 1968 executive
order on computer standards directed federal agencies to convert all of
their databases to the new character encoding standard: the American
Standard Code for Information Interchange, or ASCII.
A maddeningly vague article, though interesting. Didn't find much that said
specifically what their issues were. Interestingly, they mention having to
https://en.wikipedia.org/wiki/Baudot_code
Doesn't seem much different than any other code page, unless I'm missing
some nuance. It does give you a sense of the age/vintage of technologies
that ASCII and friends began replacing. Pre-teletype, wow.
The nuance I took away (granted, this is in part formed by knowledge of
how difficult it can be transitioning govt. agencies off of existing,
ancient, tech and into something new [to them, not to the rest of the
world]) was that in typical govt. fashion, they had no idea how many
parties were sending them data, nor how many others consumed their data
later. So when they 'updated', they discovered post cutover that 38%
of the transmitters were still sending Baudot because they had never
been identified as transmitters and never told of a need to upgrade.
Another 22% were sending Baudot because, although they were told, they
had no budget allocation with which to update, and so they did not, but
did not communicate that critical fact back until the day things went
haywire. 10% could not read a spec and update to actually meet the
spec, and 5% forgot to accommodate time zones in their 'start'
calculations, so they were 'updating' at 12 midnight local time, not 12
midnight Zulu time. That left only 25% of the transmitters who actually
started sending ASCII properly at the correct appointed time.

With a nearly identical (just different percentages) situation for all
of the receivers of the data.
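That time-zone failure mode is worth a sketch. A site that cuts over at "midnight" local time instead of midnight Zulu (UTC) switches hours early or late; the date and zone below are purely illustrative, not from the article.

```python
# Sketch of the local-midnight vs Zulu-midnight cutover mistake.
# Cutover date and the US-East zone are illustrative assumptions.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

cutover_zulu = datetime(1987, 10, 1, 0, 0, tzinfo=timezone.utc)
local_midnight = datetime(1987, 10, 1, 0, 0,
                          tzinfo=ZoneInfo("America/New_York"))

offset = local_midnight - cutover_zulu
print(offset)  # 4:00:00 -- this site would start sending ASCII four hours late
```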