text Factory

research on algorithmic text

Your feedback is always welcome.

intelligence

knot intelligence hand intelligence weave intelligence organic intelligence photosynthetic intelligence tactile intelligence heart intelligence walking intelligence luminous half-closed eye subliminal intelligence local knowledge fleeting moment thoughts

problematic physics

to know to no to immerse in the ecstatic electric torrent

to become fodder chalk outline vacant stalk

opaque, innumerable, unknowable model parameters

unexamined model dimensions not explicit not transparent not articulated expressed considered

goals not confessed unexamined intentions implicit profit motives capitalist frameworks

circular reasoning ::: model; training evaluation text spew becomes text input becomes confirmation

vast (yet skewed) data input becomes hyperspecific, averaged output

averaging ::: we can all sound like the mean expression of slanted datasets

these are hyper efficient colonization machines

a confirmation bias dream machine ::: it is an endless, escher-like mirror of the uber-kings of data ::: show me what a tech worker looks like ... check! looks right to me! (says the echo chamber)

resources / effort / emphasis should be expended in creating self-documenting neural leaps / compression / distillation ... filters / assumptions / the shape of data that honed parameters ::: but this has no market value & there is very little regulating the application of these highly inaccurate, unpredictable tools on real-life problems that impact real people in real time

the environmental costs of LLMs

the resources used in chatgpt requests (let alone the original training of the underlying large language models) are astronomical .... we will definitely kill the earth by blithely engaging in frequent "chats." and search engines will increasingly use AI technology, thereby making us all complicit in an exponential increase in energy costs for even our simple, curious search requests.

we are in a LLM (Large Language Model) arms race between a small number of capitalist tech giants ::: sustainability of the planet isn't at the top of their list of priorities

tech is often seen as "resource-less" "objective" "cloud-like" "all-knowing" "pure" "outside the market" just thought science progress not dirty oil-burning, skewed, colonialist, ... this is the manhattan project ... a bunch of brilliant minds (white, privileged men) working to push forward human understanding of the world

large language models are stochastic parrots, supercharged autocorrect .... hyper-powerful averaging machines

role of an artist

dada the data

do we just provide fodder?? ::: unpaid providers of honing advice, text, images, private data

greenwash the uber tech tools?? ::: make them seem cute ::: toys

or ::: resist ::: flood the system ::: queer-up / dada the data enact the uber ze honey hive ::: electric photosynthetic queen ::: networked revolutionaries artists of the world ::: pollinate :::

refuse ::: punk it up ::: create "bad text" ::: hand-hewn artifacts ... zines ... offline

do we need models trained on hundreds of billions of data points to create a poem about birds?

the tools we choose influence the work we make ::: cello, printing press, brush, pen, ...

we get to choose our tools ::: our hands ::: our bodies ::: our minds get to be a part of our creative process

references

more soon ...

AI in the classroom

The Atlantic: AI and toxic social media potential

University of Maine conversations about AI

"Rather, it is built to maximize the extraction of wealth and profit – from both humans and the natural world – a reality that has brought us to what we might think of it as capitalism’s techno-necro stage."

Klein, N. (2023, May 8). AI machines aren’t ‘hallucinating’. But their makers are. The Guardian.

energy use: a description of how it works

"We need to take a step back and acknowledge that simply building ever-larger neural networks is not the right path to generalized intelligence. From first principles, we need to push ourselves to discover more elegant, efficient ways to model intelligence in machines. Our ongoing battle with climate change, and thus the future of our planet, depend on it."

energy use link

How much water does ChatGPT ‘drink’ for every 20 questions it answers?

"he high cost of training and “inference” — actually running — large language models is a structural cost that differs from previous computing booms. Even when the software is built, or trained, it still requires a huge amount of computing power to run large language models because they do billions of calculations every time they return a response to a prompt. By comparison, serving web apps or pages requires much less calculation."

cnbc: the cost of ai
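a rough sense of why inference stays expensive ::: the sketch below is my own back-of-the-envelope estimate, not the article's ::: it leans on a commonly cited rule of thumb (roughly 2 floating-point operations per model parameter per generated token for a dense transformer) and an assumed response length

    # back-of-the-envelope inference cost -- an illustrative sketch, not a
    # measured figure; assumes the common rule of thumb of ~2 FLOPs per
    # model parameter per generated token for dense transformer models
    params = 175e9                # a GPT-3-scale parameter count
    flops_per_token = 2 * params  # ~3.5e11 FLOPs per generated token
    response_tokens = 500         # assumed: a few paragraphs of output
    print(f"{flops_per_token * response_tokens:.1e} FLOPs per response")  # ~1.8e+14

every "simple, curious" prompt re-runs those billions of calculations from scratch ::: serving a static web page, by comparison, is mostly lookups, not arithmetic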

"To take just one example that’s very much in the news, ChatGPT-3—which we wrote about recently—has 175 billion machine learning (ML) parameters. It was trained on high-powered NVIDIA V100 graphical processing units (GPU), but researchers say it would have required 1,024 GPUs, 34 days and $4.6 million if done on A100 GPUs.

And while energy consumption was not officially disclosed, it is estimated that ChatGPT-3 consumed 936 MWh. That’s enough energy to power approximately 30,632 US households for one day, or 97,396 European households for the same period."

Cognizant blog
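those figures can be sanity-checked using nothing but the numbers in the quote itself ::: a quick arithmetic sketch

    # sanity check using only numbers quoted above
    total_kwh = 936 * 1000      # 936 MWh of training energy, in kWh
    print(total_kwh / 30632)    # ~30.6 kWh/day implied per US household
    print(total_kwh / 97396)    # ~9.6 kWh/day implied per European household

    gpu_hours = 1024 * 34 * 24  # 1,024 GPUs running for 34 days
    print(4.6e6 / gpu_hours)    # ~$5.50 implied cost per GPU-hour

the implied daily household consumption (about 30 kWh in the US, about 10 kWh in europe) roughly matches published averages, so the quoted comparison holds together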