SmartSuite AI with Live Internet

A Simple Experiment with Google Apps Script, PaLM 2, and SmartSuite

My interest in SmartSuite has grown recently, so I thought I’d try to understand its API boundaries by instrumenting an automated AI workflow. Once that worked, I turned my attention to an approach that includes live Internet content much like I did for Coda. I chose Google Apps Script for the scripting environment for this initial SmartSuite excursion because it’s so simple to script up API calls and deploy them for automated updates. I already had a rich supply of functions I had developed for Google’s PaLM 2 LLM that I write about on this Substack from time to time.

Generative AI is Big

Anyone who says it isn’t probably thought the World Wide Web was a fad. It will change everything. But to do so, each of our little microcosms must adapt. SmartSuite’s initial step in the AI direction - a writing assistant in SmartDoc fields - is wonderful. But it requires humans to do work, and by work, I mean getting good at AI prompts, which is clearly a challenge.

Inferences need to possess three key attributes to pay hyper-dividends.

  1. Templated

  2. Abstract

  3. Automated

Unless you can define prompts that perform fluidly in a data-centric climate, the hyper-productivity made possible by generative AI will be fleeting at best, and counter-productive at worst. Let’s examine the current SmartDoc AI feature as it relates to these three attributes.

It’s a one-and-done feature. You select a predefined prompt or write one yourself, and it generates content using the OpenAI LLM. If you don’t create the ideal prompt, you must start over. The insert-below feature is a plus, but this is not a template - it cannot be saved and reused.

The AI results are not abstracted from the writing canvas. This is to say that this AI feature cannot exist as a component that can be refreshed, tuned, or modified. It is not an object that an automation process can address, nor an object that the author can fine-tune after the first instance is rendered.

These are all valid shortcomings, but I understand — it’s beta and early in the march toward the full embrace of generative AI.

Project Vision

In my zest to understand SmartSuite’s API, I quickly envisioned a SmartDoc field as an AI canvas—a place to render the results for external generative AI workflows. I’m a fan of the SmartDoc field because I’ve long believed that data-centric solutions need to lean on narratives as much as text-centric solutions need to lean on data. It’s no secret that I’ve found a delightful balance in Coda.

Could AI write an entire report and place it in a SmartDoc? Could a collection of rows do this for different reporting outcomes?

Achieving this, even without the inclusive benefits of data from the live Internet, is advantageous. It meets all three of the key attributes for realizing the hyper-productive benefits of generative AI. However, achieving this in the current state of SmartSuite requires using its API (as far as I know).

CAUTION: I am new to SmartSuite, and there’s a possibility I missed some key functionality concerning the ability to blend generative AI into rows and fields. I also believe that Make or many of the workflow adhesives could be used to achieve similar outcomes. Still, I wanted to explore a sustainable pathway that didn’t involve higher costs or uncontrolled dependencies on middle-tier services.

Exploration Objectives

Everything stems from the main function, Refresh AI Fields. The premise of this automated workflow is simple - at some interval, refresh all the AI fields. What defines an AI field? In my approach, any field name that begins with "AI.". The main function pulls the records from the designated app, looks for fields that contain AI prompts, and processes the prompts, sending the AI output to the designated field in the prompt.
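The field-discovery step can be sketched in plain JavaScript. This is not the author's actual code - the naming convention ("AI." prefix) comes from the article, but the function name and the assumption that the target field is the last [Bracketed Name] in the prompt text are mine:

```javascript
// A minimal sketch: scan one SmartSuite record for AI prompt fields --
// any field whose name begins with "AI." -- and read the target field
// out of the prompt. Assumption: the target is the last [Bracketed Name]
// token in the prompt text; the prototype's real convention may differ.
function findAiPrompts(record) {
  return Object.keys(record)
    .filter(function (name) { return name.indexOf('AI.') === 0; })
    .map(function (name) {
      var prompt = record[name];
      // Collect every [Field Name] reference in the prompt.
      var refs = prompt.match(/\[([^\]]+)\]/g) || [];
      var last = refs.length ? refs[refs.length - 1].slice(1, -1) : null;
      return { promptField: name, prompt: prompt, targetField: last };
    });
}
```

In the main loop, each object returned here would drive one inference call, with `targetField` naming the SmartDoc field to update.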

In this example, the prompt is a formula that blends the data from each row; [News Live] is the designated SmartDoc field that will receive the inference output.

Example AI Prompt Field

In this formula field, I’ve instrumented the prompt to specify a dynamic stock symbol and a reference to include live Internet data. But what Internet data exactly?

This live data is included in the AI inference by a call to SerpAPI, whose results are compressed to ensure we can pass them safely to the LLM in fewer than 4K bytes. The query, bound by double curly brackets, narrowly identifies the live Internet search results we want to hand to the AI process, ensuring its output is grounded in the most recent known information about the topic - in this example, recent news about Rivian’s stock.
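The two mechanics described here - pulling the {{double-curly}} query out of the prompt and squeezing the search results under a byte budget - might look something like the sketch below. The field name `news_results` follows SerpAPI's JSON shape; the compression scheme (title-plus-snippet lines, ASCII length as a byte proxy) is an assumption, not the prototype's exact approach:

```javascript
// Pull the live-search query bound by {{double curly brackets}} out of
// the prompt text; returns null if the prompt has no live-data query.
function extractSearchQuery(prompt) {
  var m = prompt.match(/\{\{([^}]+)\}\}/);
  return m ? m[1].trim() : null;
}

// Compress SerpAPI results into a compact text digest that stays under
// maxBytes (e.g., 4096) before being appended to the LLM prompt. For
// ASCII text, string length approximates byte length.
function compressSearchResults(serpJson, maxBytes) {
  var items = serpJson.news_results || serpJson.organic_results || [];
  var out = '';
  for (var i = 0; i < items.length; i++) {
    var line = (items[i].title || '') + ': ' + (items[i].snippet || '') + '\n';
    if (out.length + line.length > maxBytes) break; // stay under budget
    out += line;
  }
  return out;
}
```

In Apps Script, the SerpAPI call itself would be a `UrlFetchApp.fetch` against the SerpAPI endpoint with this query, and the digest would be concatenated into the prompt sent to PaLM 2.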

The outcome of this inference is updated into the [News Live] target field in this example. In contrast to inferences lacking access to the open web, this response includes news items from 2023. The formatting is not perfect, but it’s passable.
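Writing the result back is a single record update through SmartSuite's REST API. The sketch below only assembles the request; in Apps Script you would pass its output to `UrlFetchApp.fetch`. The URL pattern and headers (Token auth plus an ACCOUNT-ID header) follow SmartSuite's public API as I understand it, and the payload - a flat map of field slug to value - is simplified; a real SmartDoc field may require a richer document structure than a plain string:

```javascript
// Assemble an Apps-Script-style request to patch one field on one
// SmartSuite record. Returns { url, options } rather than fetching,
// so the shape can be inspected/tested without network access.
function buildUpdateRequest(accountId, apiKey, appId, recordId, fieldSlug, text) {
  var body = {};
  body[fieldSlug] = text; // simplified: plain string payload
  return {
    url: 'https://app.smartsuite.com/api/v1/applications/' +
         appId + '/records/' + recordId + '/',
    options: {
      method: 'patch',
      contentType: 'application/json',
      headers: {
        'Authorization': 'Token ' + apiKey,
        'ACCOUNT-ID': accountId
      },
      payload: JSON.stringify(body)
    }
  };
}

// In Apps Script:
//   var req = buildUpdateRequest(wsId, key, appId, recId, 'news_live', output);
//   UrlFetchApp.fetch(req.url, req.options);
```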

Example Live AI Output

My prototype supports both live-web and static inferencing based on the LLM (PaLM 2 in this case). PaLM 2 stopped training from the Web in early 2023, and this stasis is reflected in the News output on the left. But as you can see, the News Live field includes stories as recent as a few weeks ago. With GPT models, the contrast is worse - training stopped in mid-2021, hence the push to create plugins that overcome this weakness.

Comparative Static (left) and Live (right) AI Outputs

Takeaways

There’s no doubt that generative AI and SmartSuite have a bright future. I look forward to blending these ideas seamlessly and without writing a lot of code. Until then, we code...

You'll find the source code here.
