BIML Speaks at CCSC Eastern

As independent scholars, we have a huge amount of respect for professors and students of Computer Science at small colleges in the United States. We were proud to serve as the dinner speaker at the CCSC Eastern Conference this year.

Our payment was a cool T-shirt and some intellectual stimulation. (Now you know why McGraw never takes selfies.)

One-time student of mine at Earlham College, one-time employee of mine at Cigital, and now the infamous daveho (author of FindBugs).

A Visit to IU Bloomington

Sometimes it pays to stop and think, especially if you can surround yourself with some exceptional grad students. On the way to Rose-Hulman, BIML made a pit stop in Bloomington for a dinner focused on two papers: Vaswani et al.'s 2017 "Attention Is All You Need" (which defines the transformer architecture; see https://berryvilleiml.com/bibliography/) and Dennis's "The Antecedents of Transformer Models" (which will appear soon in Current Directions in Psychological Science).

The idea was to explore and critique the design decisions underlying the transformer architecture. Bottom line? Most of them were made for efficiency reasons. There is plenty of room for better cognitively inspired ML. Maybe efficiency is NOT all you need.
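For readers who want a concrete sense of what we were critiquing, here is a minimal NumPy sketch of the scaled dot-product attention at the heart of the transformer. This is our own illustrative code, not the paper's reference implementation; names and shapes are assumptions. The 1/sqrt(d_k) scaling is a good example of the pragmatic choices in question: it is there to keep the softmax numerically well behaved, not because brains work that way.

```python
# Illustrative sketch of scaled dot-product attention (Vaswani et al., 2017).
# Shapes and variable names are assumptions for the example, not the paper's code.
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q, K: (seq_len, d_k); V: (seq_len, d_v)
    d_k = Q.shape[-1]
    # Scaling by sqrt(d_k) keeps the dot products from saturating the softmax,
    # an efficiency/stability decision rather than a cognitively motivated one.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # weighted mixture of value vectors

# Tiny usage example with random inputs.
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```

Everything here reduces to dense matrix multiplications that parallelize beautifully on GPUs, which is much of the point.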

We did this all over delicious Korean food at Hoosier Seoulmate.

Special thanks to Rob Goldstone, who provided the Dennis manuscript and grounded the cognitive psychology thread, and to Eli McGraw, who conjured up the dinner from thin air.

The Lake Monroe home away from home.

Invited Talk at Rose-Hulman Institute of Technology

Dr. McGraw gave a talk on Wednesday, 10/16/24, at Rose-Hulman in Terre Haute, Indiana. This version of the talk was aimed at computer science students. There were some very good questions.

Calypso Dublin Panel Features BIML

Here is a video of the Dublin panel, recorded on October 3, 2024. It was an excellent event. Have a watch.

It’s the Data, Stupid

Dan Geer came across this marketing thingy and sent it over. It serves to remind us that when it comes to ML, it’s all about the data.

Take a look at this LAWFARE article we wrote with Dan about data feudalism.

Welcome to the era of data feudalism. Large language model (LLM) foundation models require huge oceans of data for training—the more data trained upon, the better the result. But while the massive data collections began as a straightforward harvesting of public observables, those collections are now being sectioned off. To describe this situation, consider a land analogy: The first settlers coming into what was a common wilderness are stringing that wilderness with barbed wire. If and when entire enormous parts of the observable internet (say, Google search data, Twitter/X postings, or GitHub code piles) are cordoned off, it is not clear what hegemony will accrue to those first movers; they are little different from squatters trusting their “open and notorious occupation” will lead to adverse possession. Meanwhile, originators of large data sets (for example, the New York Times) have come to realize that their data are valuable in a new way and are demanding compensation even after those data have become part of somebody else’s LLM foundation model. Who can gain access control for the internet’s publicly reachable data pool, and why? Lock-in for early LLM foundation model movers is a very real risk.

BIML Livestream 7/11/24 2pm ET: Deciphering AI: Unpacking the Impact on Cybersecurity

BIML enthusiasts may be interested in this event, in which co-founder Gary McGraw participated.

Deciphering AI: Unpacking the Impact on Cybersecurity, by Lindsey O'Donnell-Welch

The panel also features Phil Venables, CISO of Google Cloud, and Nathan Hamiel from Black Hat.

Here’s the Decipher landing page where tomorrow’s event will be livestreamed: https://duo.com/decipher/deciphering-ai-unpacking-the-impact-on-cybersecurity

It will also be livestreamed on the Decipher LinkedIn page: https://www.linkedin.com/events/decipheringai-unpackingtheimpac7207405768856219648/theater/

Streaming will begin July 11 at 2pm ET.