
Friday, February 03, 2023

Do AI Essays Violate Copyright?

 I'm not a lawyer, nor do I play one on television; however, there is reason to believe that any essay generated by an AI is not subject to a claim of copyright infringement.

According to the U.S. Copyright Office:

Copyright law does not protect ideas, methods, or systems. Copyright protection is therefore not available for ideas or procedures for doing, making, or building things; scientific or technical methods or discoveries; business operations or procedures; mathematical principles; formulas or algorithms; or any other concept, process, or method of operation.

Section 102 of the Copyright Act (title 17 of the U.S. Code) clearly expresses this principle: “In no case does copyright protection for an original work of authorship extend to any idea, procedure, process, system, method of operation, concept, principle, or discovery, regardless of the form in which it is described, explained, illustrated, or embodied in such work.” Inventions are subject matter for patents, not copyrights. 

An AI typically assigns every item in its training data set (in ChatGPT's case, each word) a numeric value. Once it has this string of numbers, the program uses an algorithm to generate a new string of numbers, which is then converted back into words. Technically, the conversion and the algorithmic operations performed on the training data set arguably mean the words of the resulting AI-generated essay are not copied, even if the resulting text is identical to one or more sections of the training data set.
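To make that concrete, here is a minimal sketch in Python of the words-to-numbers-to-words pipeline described above. It is emphatically not ChatGPT's actual algorithm: the tiny vocabulary, the bigram counting, and the sampling rule are all invented here purely for illustration. But the shape of the process is the same: text becomes numbers, an algorithm produces new numbers, and the numbers become text again.

import random

# A toy "training data set": in a real system this would be billions of words.
training_text = "the cat sat on the mat the dog sat on the rug"
words = training_text.split()

# Step 1: assign every distinct word a numeric value (a token ID).
vocab = {word: idx for idx, word in enumerate(sorted(set(words)))}
inverse_vocab = {idx: word for word, idx in vocab.items()}

# Step 2: convert the training text into a string of numbers.
token_ids = [vocab[w] for w in words]

# Step 3: an algorithm over the numbers. Here, a crude bigram model:
# record which ID follows which, then sample the next ID from those records.
follows = {}
for current, nxt in zip(token_ids, token_ids[1:]):
    follows.setdefault(current, []).append(nxt)

def generate(start_word, length=6):
    current = vocab[start_word]
    output = [current]
    for _ in range(length):
        current = random.choice(follows.get(current, token_ids))
        output.append(current)
    # Step 4: convert the new string of numbers back into words.
    return " ".join(inverse_vocab[i] for i in output)

print(generate("the"))  # e.g. "the cat sat on the rug the"

Even in this toy version, the output is produced by arithmetic on numbers rather than by copying spans of text, which is the crux of the argument above.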

Furthermore, since this algorithm is a "business operation or procedure" that uses "mathematical... formulas or algorithms" in a "process, or method of operation", the conversion process is likewise not something that can be copyrighted. The specific code that expresses the algorithm can be copyrighted, but the conversion algorithm itself cannot be copyrighted.

If AI-generated essays turn out not to be subject to copyright infringement claims because of their algorithmic foundation, that raises an additional question. Why should the algorithmic operation of a machine be privileged over the algorithmic operation of the wetware inside an individual's brain? We may not fully understand how the brain's algorithms work, and we certainly cannot replicate them, but we can obtain very similar outputs by using them ourselves.

So, do copyright and plagiarism rules seem reasonable in the 21st century? As knowledge becomes more of a computer-generated resource, it is hard to see how these ideas can continue to be useful within the culture.

Thursday, January 26, 2023

Problems with AI

ChatGPT and a handful of other social AIs are making a big splash. They are unreasonably good at many tasks and are already used to generate real-world work. However, these systems didn't get this good based on the code written by the people building them. They got this good by using data strip-mined from the Web (social networking posts, images, etc.) and by employing low-paid "AI Turks." Recently, it was disclosed that ChatGPT's maker, currently valued at $29 billion and climbing fast, is employing workers in Kenya at $2 an hour to train the system.

There are other problems. While its ability to create abstract text is sufficiently advanced to fool experts in the field, the current iteration of ChatGPT frequently lies. It lies about what it can do. It claims its training data was cut off in 2021, yet it knows Elon Musk is head of Twitter. It creates fake citations when its statements are questioned. It invents fake people to back up its statements. It even cheats at Hangman. And everything it does can now be beamed, via AR-augmented contact lenses, straight onto your corneas.

But the problem is not restricted to any of these difficulties. There are deeper problems. For instance, the whole point of an AI model is that the algorithm self-modifies to improve its responses. ChatGPT doesn't appear to be doing that. It was trained on a specific information set; how large that set was doesn't matter. The point is, it doesn't modify itself based on user input and correction.

Now, this refusal to self-modify was probably a design decision on the part of the builders. They were concerned that maleficent or otherwise mission-oriented parties would modify the information base enough that the AI would start spewing out incorrect information. But that refusal to allow user modification assumes the original data-set boundaries established by the programmers were themselves correct and reasonably complete. Given its consistent misrepresentation of facts, this is demonstrably not true.

Wikipedia is crowd-sourced, but it is often skewed or incomplete because its moderators tend to be unemployed, mission-oriented people who choose to slant certain information sets. Given its responses, ChatGPT clearly comes to us already skewed.

So there is the first issue with AI: should it allow crowd-sourcing? If so, it arguably allows the people with the most free time to modify its information training set. But if its programmers do not allow crowd-sourcing, then the AI does not get some of the self-correcting intelligence of the masses. 

The idea behind democracy is that you cannot fool all of the people all of the time. It is not clear that this conceit is correct. So, how should AI programmers source information to make sure it stays accurate?

And who gets to determine which interpretation of various fact sets is the most accurate? Currently, internet-available AI is really just a set of rules established by anonymous programmers. The rules pattern-match character strings out of a discrete data set, a.k.a. the "training data." The anonymous programmers have pre-defined which character strings may be displayed and which are not permitted to be displayed. The content does not update based on user input.

Since every character string can be, and in fact is, reduced to a long binary number, the information produced by ChatGPT or any other AI is essentially a number. ChatGPT and AIs like it are simply very flexible electronic calculators, operating on character strings that now include not just Arabic numerals but also alphabets, both of which, as far as the computer is concerned, are represented as binary numbers.
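To see what "essentially a number" means in practice, here is a small Python illustration. It assumes UTF-8 encoding, and the sample sentence is arbitrary; the point is only that the string-to-number mapping is exact and reversible.

text = "To be, or not to be"

# Encode the string as bytes, then read those bytes as one big whole number.
as_bytes = text.encode("utf-8")
as_integer = int.from_bytes(as_bytes, byteorder="big")

print(as_integer)       # a single (very large) whole number
print(bin(as_integer))  # the same number written out in binary

# The mapping is reversible: the number alone recovers the original text.
recovered = as_integer.to_bytes(len(as_bytes), byteorder="big").decode("utf-8")
assert recovered == text

Every essay, then, corresponds to exactly one such whole number, which is the sense in which the paragraphs below speak of "legal" and "illegal" numbers.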

So, when we talk about whether ChatGPT results should be allowed in the classroom, we are actually ruling on whether the use of certain numbers should be made illegal. If a particular number/essay is generated by ChatGPT, then it is an illegal number, but if it is hand-generated by the student, then it is legal. Is that our position? That sounds a lot like the late-20th century fight over whether math teachers should permit calculators in the math classroom.

Math teachers lost that fight decades ago. Since the 1990s, students have been permitted to use calculators to generate answers to problems they did not themselves understand how to solve. The argument was that math teachers should concentrate on teaching higher-order thinking skills instead of having students engage in low-order rote memorization. Unfortunately, as AI has now irrevocably demonstrated, the manipulation of nouns and verbs in a sentence is, like math, nothing more than rote memorization: the memorization of a complicated algorithm and its application.

Keep in mind, it was not math teachers who advocated for calculators in the classroom; it was English and history teachers who made, and won, that argument. Now that computers apply grammatical and referential pattern-matching algorithms in the English and history classrooms, does this substantially change the original late-20th-century argument? Should we allow liberal-arts calculators in the classroom?

Basic math problems, and their solutions, cannot claim copyright or plagiarism protection because they are known to be "common knowledge." Do basic English expressions, even unto whole essays, make a better copyright or plagiarism claim just because the algorithms that produce those expressions are somewhat more opaque? To put it another way, if electronic calculators for math allow students to pursue higher-order math skills more efficiently, then does not an AI-generated essay remove the need for mastery of low-order English skills? Proper grammar, subject-verb agreement, these and similar skill-sets are merely algorithmic solutions which students memorize to solve English and history problems. Should not the student eschew this low-order skillset so as to spend his or her valuable time pursuing higher-order critical thinking skills? 

This is not a new question for these instructors. For at least a decade, it has been pointless to grade a student on proper footnote or endnote citation. The necessary information can be plugged into any number of free foot/endnote generators, including software built into the word processor itself. Instructors are no longer grading students on such formatting; they are grading the anonymous programmers who wrote the software that formats the foot/endnotes for the student. The rote memorization of where, exactly, one should place a comma or period to accommodate a specific style has long since gone the way of the dodo. Should subject-verb agreement or discussions of "than vs. then" join footnotes and endnotes on the dustbin of history?

Plato and Socrates decried the manufacture of books because they understood the written word would destroy their culture's understanding of knowledge. For the ancient Greeks, knowledge was memorized, stored in mnemonic "memory palaces" within one's own mind. To be forced to refer to outside sources for knowledge was a form of intentional self-harm; it weakened the mind. It created the illusion of discourse where there was no discourse. It was virtual reality, the pretense of talking with an absent or dead author, not real dialogue with another living human being.

“Most persons are surprised, and many distressed, to learn that essentially the same objections commonly urged today against computers were urged by Plato in the Phaedrus (274–7) and in the Seventh Letter against writing. Writing, Plato has Socrates say in the Phaedrus, is inhuman, pretending to establish outside the mind what in reality can be only in the mind. It is a thing, a manufactured product. The same of course is said of computers. Secondly, Plato's Socrates urges, writing destroys memory. Those who use writing will become forgetful, relying on an external resource for what they lack in internal resources. Writing weakens the mind.”

                                            ~Walter J. Ong, Orality and Literacy: The Technologizing of the Word 

The ancients did not fully understand that the ability to write books opened a much more intricate dialogue with a much vaster audience of dead witnesses, each whispering his or her own life experience and perspective. The written word connected a vast complex of multi-processed information that could never be matched by any individual's oral transmission of knowledge. One man might build a mighty mnemonic palace in his mind, but his singular palace died with him. The written word kept his palace alive, even if as a faded ghost rather than the vibrant original.

The written word, whether via individual scrolls, letters or books, had many disadvantages, but that same written word was a force multiplier that eventually became the foundation of multiple knowledge revolutions. In the same way, the Internet, as concentrated and distilled through the search engine and its AI progeny, holds promise to multiply knowledge yet again. 

Before 1935, a "computer" referred to a man or woman who could do math rapidly in his or her head, sometimes with the assistance of pencil and paper. Thus, the word "electronic" had to be added to distinguish the person from the electronics-based AI that silicon introduced to the math classroom. Now, we use the designation "AI" to distinguish the computer in our pocket from the computing done on machines we cannot see, but whose results fill our screens. Whether we discuss the hand-held AI calculator in the 1980s math classroom or the 21st-century AI-trained cloud computer, we have teamed with a vast number of anonymous programmers whose algorithms, for better or worse, define the knowledge base we access.

Socrates died in 399 BC, and his distaste for books died with him. For five hundred years, scribes labored over their books and the copying of those books, and for those five hundred years plagiarism was unknown. After all, writing was an esoteric skill, very expensive to develop and maintain. It was the province of the landed aristocrat and the wealthy. An ancient scroll or a medieval book cost as much as a private jet would today. A peasant might be plucked out to be trained as a pilot (in medieval terms, trained as a priest or cleric, which is why writing tasks are still referred to as "clerical"), but the vast majority of people simply couldn't afford the luxury. It wasn't until nearly 100 AD that the poet Martial coined a term to describe how other poets were "kidnapping and enslaving" his words for their own use. "Plagiarism" did not enter modern English until 1601, when Ben Jonson stole Martial's term, bringing the first-century distaste for this practice into the modern era.

It took 500 years for plagiarism to be denounced; it was nearly a thousand years before someone thought to call the copying of books a crime. The first recorded copyright dispute arose in 6th-century Ireland. The king ruled "every cow has its calf, and every book its copy," thereby granting copyright to the book's original owner (note: not the author, the owner) while linking the book's expensive parchment pages to the animal from which they were derived. Even so, copyright was not codified into modern law until 1710. Even at that late date, the printing press was not yet three centuries old, and literacy was still an uncommon skill. Plagiarism and copyright are both ideas invented to handle the aristocratic written word. Do they truly apply to the computed alphabet string?

Although it was not realized at the time, plagiarism and copyright are founded on a peculiar mathematical idea. Since every language expression can be reduced to a number and processed like a number, plagiarism and copyright rest on the notion that a person can page, one by one, through the infinite realm of whole numbers and lay a private-property claim to any individual number that catches his fancy. Because every essay is essentially just a large number, plagiarism and copyright mean a particular whole number can belong to someone for a set number of years. It means no one else can use that number without violating the claim made by its original discoverer; it means numbers can be bought or sold, stolen or copied. Numbers become electronic cattle, or electronic farmland. The private ownership of a number is a literary version of the enclosure movement, a holdover of the idea that the mnemonic palace one person builds was not built so much as discovered. But, once discovered, that palace belongs to that person during his life, and no one else can use it.

And therein lies a question: are mnemonic palaces still the province of just one person? The ancient Greeks who created the practice of building mnemonic palaces had no use for plagiarism or copyright because the palace did not exist outside any individual's mind. Once it was stored on paper, it did. But once a number is stored on a set of computers, copied at minimal cost and available to all, can such ownership claims reasonably be enforced? Are plagiarism and copyright claims still valid, or are they, like memory palaces, footnote formats, and multiplication tables, relics of the past? If the communication of knowledge is undergoing a fundamental transformation, then are the written walls built over the last two millennia in the process of being torn down?

The PC is less than fifty years old, the smart phone has not yet reached two score years. It is impossible to say whether or not the walls will hold and, if they hold, for how long. But as we watch knowledge and its related skillsets transform from the private mind, to the aristocratic page, and now onto the universal computer, it is worth asking the question. 


Monday, January 16, 2023

Why Universities Chose Wokeness, and Why it Won't Work

It has been predicted that half of all American colleges and universities will close in the next ten years, thus fulfilling the ancient prophecy, "Get woke, go broke." But why is wokeness such a problem for universities?

The answer: Griggs v. Duke Power Company, 1971. In that decision, the US Supreme Court found that a particular use of IQ tests in hiring practices caused a disproportionate impact on African American employees. "Disproportionate impact" can make a facially neutral policy illegal under various US civil rights laws.

This was not a blanket ban on IQ testing in employment, but corporations, being risk-averse, stopped doing it. Unfortunately, they still needed a proxy for the IQ test. Whether we like it or not, an IQ score does correlate pretty well with how easily an individual can perform a specific job. So, what could serve as that proxy? Well, there was one area where IQ tests were still considered a useful screening tool: college admissions. While college in the late 19th and early 20th century was mostly about who your parents were, colleges kept the fig leaf of meritocracy bound firmly about their ivory towers. Since colleges screened applicants for parentage (a proxy for power) and for IQ (a proxy for the ability to learn a job), businesses began using the college degree as their job-screening tool.

The GI Bill was signed by FDR in 1944. While roughly half of WW II's returning veterans made use of it, the funding rate dropped precipitously from 1955 into the 1960s, because college wasn't really an important item on anyone's radar. However, after the pivotal 1971 Griggs decision, businesses made the college degree central. Businesses wanted employees who were hooked into either power and privilege or tour-de-force intellect, preferably both. The college degree guaranteed that.

This gave universities a virtual monopoly on discriminating on the basis of intelligence. Since that kind of discrimination is extremely useful to businesses, all a university has to do is maintain its reputation as a mostly reliable discriminator, and the actual content of what it teaches becomes virtually irrelevant.

With Griggs, SCOTUS had unwittingly rigged the employment playing field. Vietnam vets were the first group to see which way the wind blew. They became the first generation to inflate their GI Bill life rafts and float into higher-paying jobs. The civilian world followed suit. Universities made out like bandits.

Unfortunately for everyone, the Pill was released a decade before Griggs, just in time to start decimating the baby boom. The US total fertility rate (TFR) has been dropping steadily since 1800, with the only significant baby boom taking place after WW II. By 1963, that boom was over. But during the 1970s and early 80s, the Boomer population was still traveling through the anaconda of higher learning. Nobody realized the good times could only roll for about twenty years. It wasn't apparent that the dropping number of parents would eventually bring it all crashing down.

And that's where "wokeness" comes in. The number of available students born to American parents has steadily dropped since the 1980s. Today, there aren't enough backsides to fill the seats. There are three ways to handle this declining enrollment: (1) raise tuition, (2) import students from other countries, and (3) lower standards. The first two have already been tried. Tuition is as high as the market will bear. Every industrialized country in the world has a declining TFR (and a similar problem with its own colleges), so there's a limit to the number of students that can be imported. That just leaves lowered standards.

And that's the beauty of wokeness. Wokeness lowers standards while virtue-washing the real reason colleges encourage it: they need every warm body they can get, no matter how stupid, in order to fund administration perks. In fact, stupid people make better students because they will sign for larger federal loans.

So, the entire wokeness movement, which found its footing at the universities, did so because of the Pill and Griggs. But wokeness is just gasoline on the bonfire of university vanities. Remember Griggs?

You see, universities have forgotten that their degrees became valuable to businesses only because Griggs required American businesses to use an IQ proxy. Ironically, by embracing wokeness in order to save their bottom line, those same universities are throwing away the only reason a business has to use them: because universities now refuse to test candidates and grade on ability, their degrees are worthless as an IQ proxy.

How are businesses responding to this new state of affairs? They are dispensing with the need for university degrees. Instead, businesses have begun using certifying agencies (IT certification, management certification, etc.) and certifications as a proxy for IQ.

By and large, certifying agencies are a better work-around to Griggs than university degrees. They are more on-point for specific job categories, harder to litigate against, and they now do a better job of screening out stupid people. Which is all that businesses ever wanted to do in the first place.

Wednesday, January 11, 2023

America: Land of Human Smuggling

21st-century Americans complain about "coyotes" smuggling people into the United States. Americans are absolutely indignant that some of those smuggled people might be criminals. It's not clear why. Human smuggling is an old American tradition. Indeed, the word "kidnapping" was invented precisely to describe the practice, and Robert Louis Stevenson's novel Kidnapped was based upon a true story of exactly this British practice.

The earliest known use of the verb kidnap is from A brief historical relation of State affairs from September 1678 to April 1714, by Narcissus Luttrell (1657-1732), annalist and book collector; he wrote that, on 23rd May 1682, there was:

"a tryall at the kings bench barr upon an indictment against Mr. John Wilmore, for spiriting or kidnapping away a young boy under the age of 13 years, called Richard Siviter, and sending him to Jamaica : the jury was a very good one, returned out of the county of Kent : the witnesses against him were some to prove that there was in generall such a trade as kidnapping or spiriting away children, and that he did beleive [sic] there had been above 500 sent away in two years at Christmas last."

Up to 75 percent of all the individuals who came off the transatlantic ships in the 17th century were indentured servants, but the European servants did not always come willingly:

Boys and girls of the poorer classes were hustled on board ships and virtually sold into slavery for a term of years. Kidnaping or ‘spiriting’ became a fine art under Charles II. Slums and alleys were raked for material to stock the plantations… About 1670 no fewer than ten thousand persons were ‘spirited’ away from England in one year. One kidnaper testified in 1671 that he had sent five hundred persons a year to the colonies for twelve years and another testified that he had sent 840 in one year.

Without Indentures: Index to White Slave Children in Colonial Court Records [Maryland and Virginia], by Richard Hayes Phillips, lists more than 5,000 children who were kidnapped from England, Ireland, Scotland, and New England and sold into slavery in Maryland and Virginia from 1660 to 1720. These kidnappings were the result of the 1601 Act for the Relief of the Poor, also known as the Elizabethan Poor Law, which stipulated that children who were orphaned or whose parents were unable to support them could be taken in by parish officials and apprenticed to local tradespeople. This law was amended in 1609, 1662, and again in 1697 and 1722, giving officials progressively more power to deal with children who were beggars or vagrants. These children were not indentured, and the courts assigned their time of servitude.

Government kidnapping of children was not an unusual event. By 1600, Queen Elizabeth had granted entertainers the right to kidnap children in order to use them as performers in the theater. Once the children were taken, the parents had little recourse. But the shenanigans did not stop with kidnapping children for transport to the colonies. In 1718, Britain passed the Transportation Act, which allowed convicts to be sold as indentured servants in the colonies. Britain shipped  approximately 60,000 convicts, dubbed "the King's passengers." Roughly ninety percent stayed in Maryland and Virginia. Between 1718 and 1775, up to one quarter of the British immigrants to America were convicts sold into servitude by the British government. According to the vicar of Wendover, transportation served the purpose of ‘draining the Nation of its offensive Rubbish’. Benjamin Franklin compared the practice to the emptying of a chamber pot on a colonial dinner table. But, the practice was  so popular in England that Daniel Defoe wrote Moll Flanders in order to support the government practice.

On both the Atlantic passage and during their servitude, European convicts were treated worse than slaves as they brought less cash, were less physically fit, and had criminal records. They were typically bought by poorer farmers who could not afford slaves. 

The French populated their Louisiana territories in much the same way the English populated their colonies. John Law shipped convicts and kidnapped children to the Gulf coast en masse. Being Catholic, the French actually took the time to perform mass marriages of the kidnapped children, to ensure family formation and increased population once the newlywed kidnap victims arrived in their new location. Unfortunately for the plan, more than half the women and nearly a quarter of the men typically died during transport.

So, prior to the Revolution, roughly 30 percent of American immigrants were convicts who were sentenced to be transported to the colonies and sold as indentured servants. Thousands more were kidnapped children either assigned by the courts to servitude for the crime of being an orphan, or spirited away by professional kidnappers who made their living off human trafficking. When someone tells you they can trace their lineage back to the earliest American settlers, the chances are quite good their ancestor was a convict or a kidnap victim.

Welcome to American history.

Friday, January 06, 2023

What's Wrong With Human Composting?

Human composting has become a subject of popular discussion, and I see a lot of religiously-minded people acting upset about it. For the life of me, I cannot figure out what the problem is supposed to be. How is this different from burying people in a blanket or wooden coffin, which is what Christians have done for literally thousands of years?

Embalming only really became a thing after the Civil War; it isn't required in any state in the union, and a lot of faiths (e.g., Judaism and Islam) completely forbid it. Actually, Christianity is weird for allowing it. Embalming poisons the soil. Embalmers are required to wear full hazmat suits, including respirators, when they do it. Embalming fluid used to contain arsenic, and as a result 19th-century cemeteries are almost all toxic waste sites, leaching arsenic into the groundwater.

"Human composting" is pretty much how family cemeteries stay small. Individual family members are serially buried in the same 18 square feet of dirt over the centuries. Monasteries would commonly bury monks one on top of another, and most of their remains would decompose in the ground seamlessly over the centuries so each plot could be re-used.

What is being described in the article is not much different than how human beings have been buried for almost all of human history. What is the big deal? 

Genesis says Adam was formed from the clay, the word "adam" means "red" and is related to the Hebrew "adamah" which means land or soil. St. Paul talks of men as clay vessels (2 Cor 4:7). The Anglican Book of Common Prayer implicitly endorses composting: "we therefore commit this body to the ground, earth to earth, ashes to ashes, dust to dust; in sure and certain hope of the Resurrection to eternal life."

The Catholic liturgy does the same. On Ash Wednesday, the priest inscribes the cross on your forehead with the words, "Remember, man, thou art dust, and to dust thou shalt return." 


Friday, December 16, 2022

The Origins of Hanukkah

Ironically, Hanukkah is found in Christian Scripture, in 1st and 2nd Maccabees, but is not found in Jewish Scripture at all. Why? Well, these two books (there are actually also a 3rd and 4th Maccabees, which some Christian groups consider part of Scripture but which the Catholic Church does not) were originally written in Greek by the Diaspora Jewish community and were never translated back into Hebrew. Because Christians preaching the Gospel among Diaspora Jews were using several Greek-language books like Maccabees to great effect in conversion, the Jews who were opposed to Christianity ultimately ruled that any book not originally written in Hebrew was NOT part of Scripture and should not be considered sacred.

Unfortunately, this ruling meant the Hanukkah celebration was no longer a Scriptural event. This wasn't a huge loss, as it had never been a great holy day. But Christianity spread through Europe and North Africa, and the celebration of Christmas began to become a thing around 350 AD.

While it was considered a minor liturgical holiday for the first millennium of Christianity, by the medieval period, Christmas was a major cultural holiday for Christians. The early industrial period turned industrial nations into a surplus-goods society, in which Christmas came to be a way to showcase the cornucopia of goods Christian Europe was producing. Gift-giving entered the picture in a major way. Every culture (Asian, African, etc.) wanted to share in that new outpouring of Christian European wealth.

It was therefore no coincidence that by the early industrial period, Jews were converting to Christianity wholesale and retail. Indeed, some scholars point out that, if not for WW II, Jewish assimilation was on track to wipe out Jewish culture in Europe by the end of the 20th century. To slow the assimilation, one of the rabbinic tools was the elevation of Hanukkah to a major cultural event in order to combat the influence of Christmas. It's still a very minor liturgical event, but the cultural significance now pretty much swamps the religious significance. Hanukkah is now a way for Jews in a Christian society to celebrate Christmas without feeling guilty.

Saturday, December 10, 2022

Why Colleges Are Dying

Many people complain that college standards are dropping. They don't seem to understand that colleges really don't have a choice.

The total fertility rate in America has been dropping steadily since 1800. The only uptick in the last two centuries was the post-WW II Boomer generation. That ended in 1963. Ever since, TFR has continued its inexorable trend down. This means the number of students who can enter any particular college, or all colleges combined, also necessarily trends down each year.

So, colleges MUST dumb down standards because the number of students coming in drops every year. Colleges push abortion and contraception, then are shocked to find their students don't raise kids or send them to college in turn. As a result, to keep enrollment numbers up, standards MUST drop.

But why are colleges pushing abortion and contraception? Well, eugenics has long been a darling of the Progressive movement. In addition, women are taking over both the degree programs and the administration. The kind of women who want a career are typically the kind of women who don't want children.  But even sterile women value relationships over truth. Institutions reflect the values of the people who run them. 

So, as women take over universities, universities stop being truth-seeking organizations and start being extended touchy-feely therapy sessions for sterile psychotic women who invite in every stray dog so the psychotic women can continue their extended, paid coffee klatsch sessions.

Women don't build stuff. 

Men do. 

Kick men out, and stuff don't get built.