I'd been needing to refactor the pagination logic in the Mastodon plugin for Steampipe. After a number of abortive attempts, I took another run at it this week with the help of the current generation of LLM-powered coding assistants.

Here was the issue. The pre-release version of the plugin consolidated pagination for many tables in one place. That was a good idea, but the downside was that there was only one Steampipe table representing what should have been many of them. So you could say select * from mastodon_timeline, but then you had to qualify with where timeline='home' or where timeline='local', and so on. For a user of the plugin this was awkward; you'd rather say select * from mastodon_timeline_home or select * from mastodon_timeline_local, and reserve the where clause for more specific purposes.

The v1 plugin created separate tables, but duplicated the pagination logic on a per-table basis. It worked, and was good enough to ship the plugin in time to demo at FediForum, but it clearly needed improvement.

ChatGPT-4 and Sourcegraph Cody

Since then, Sourcegraph has launched its new coding assistant, Cody, which you can run as a VS Code extension or on sourcegraph.com. This set up the possibility for an interesting comparison. ChatGPT-4 builds on OpenAI's LLM; Sourcegraph's Cody, by contrast, uses Anthropic's Claude. Another key difference is that ChatGPT only has the context you paste into it. Cody, sitting inside VS Code, can see your repository and has all that context. And if you index your repo, which is something Sourcegraph will do for beta users on request, then Cody has access to what are called embeddings that represent the structure of your code in various ways. These embeddings, according to Sourcegraph, can powerfully enhance your LLM prompts.

Even without embeddings, Cody offers quite a range of assistance, from a top-level overview of what your repo does to line-level improvement.
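To make the duplication concrete, here is a minimal, self-contained sketch of the v1 shape: each Steampipe table has a List function, and each one carries its own copy of a Mastodon-style max_id pagination loop. All names are hypothetical and the fetcher is faked; the real plugin calls the Mastodon API via the Steampipe plugin SDK.

```go
package main

import "fmt"

// item stands in for one decoded API result (a status, an account, ...).
type item struct{ ID int }

// fetchPage fakes one Mastodon-style page fetch: return up to pageSize
// items with IDs below the maxID cursor. An empty page means we're done.
func fetchPage(maxID, pageSize int) []item {
	var page []item
	for id := maxID - 1; id > 0 && len(page) < pageSize; id-- {
		page = append(page, item{ID: id})
	}
	return page
}

// listHomeTimeline owns its own copy of the pagination loop.
func listHomeTimeline() []item {
	var all []item
	maxID := 10
	for {
		page := fetchPage(maxID, 3)
		if len(page) == 0 {
			return all
		}
		all = append(all, page...)
		maxID = page[len(page)-1].ID // advance cursor to oldest ID seen
	}
}

// listLocalTimeline is a line-for-line copy of the same loop -- the
// kind of duplication that, in the v1 plugin, multiplied across tables.
func listLocalTimeline() []item {
	var all []item
	maxID := 10
	for {
		page := fetchPage(maxID, 3)
		if len(page) == 0 {
			return all
		}
		all = append(all, page...)
		maxID = page[len(page)-1].ID
	}
}

func main() {
	fmt.Println(len(listHomeTimeline()), len(listLocalTimeline()))
}
```

Each copy works, but every bug fix or behavior change has to be applied N times, which is the maintenance burden the refactoring targets.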
It's all packaged, in the extension, as a set of recipes behind buttons with names like Explain selected code, Improve variable names, and Smell code. I haven't yet used these recipes enough to form solid opinions, though. For this exercise I used Cody mostly in a ChatGPT-like conversational way.
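The refactoring described in the rest of this post converges on a pattern along these lines: a single paginate function that owns the cursor loop, takes a data-fetching function as a parameter, and streams each result to a callback instead of returning a batch. This is a hedged, self-contained sketch with invented names; in the real plugin, the stream callback's role is played by the Steampipe SDK's StreamListItem, which this sketch only imitates.

```go
package main

import "fmt"

// item stands in for one decoded API result (status, account, notification).
type item struct{ ID int }

// fetchFunc is the per-timeline data fetcher: given a max_id cursor,
// return the next page (an empty page means there's nothing older).
type fetchFunc func(maxID int) ([]item, error)

// paginate owns the cursor loop once, for every table. Instead of
// accumulating a potentially huge batch in memory, it hands each item
// to the caller-supplied stream callback and returns only nil or err.
func paginate(fetch fetchFunc, stream func(item)) error {
	maxID := 10 // a stand-in "start from newest" cursor for this sketch
	for {
		page, err := fetch(maxID)
		if err != nil {
			return err
		}
		if len(page) == 0 {
			return nil
		}
		for _, it := range page {
			stream(it)
		}
		maxID = page[len(page)-1].ID // advance cursor to oldest ID seen
	}
}

func main() {
	// A fake fetcher: IDs count down from the cursor, three per page.
	fetch := func(maxID int) ([]item, error) {
		var page []item
		for id := maxID - 1; id > 0 && len(page) < 3; id-- {
			page = append(page, item{ID: id})
		}
		return page, nil
	}
	n := 0
	if err := paginate(fetch, func(item) { n++ }); err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println("streamed", n, "items")
}
```

Because paginate streams as it goes and returns only nil or an error, the calling List function stays tiny, and memory use no longer grows with the size of the timeline.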
In that mode, it's wonderful to be able to select the code you want to talk about, instead of pasting it into the chat. In both cases, as should be no surprise, it wasn't enough to simply ask the tools to consolidate the pagination logic. They were perfectly happy to propose solutions that could never work, and might not even compile. So I started with an easier version of the problem. Mastodon uses the same pagination machinery for APIs that return arrays of several kinds of things: Statuses (toots), Accounts, and Notifications. By tackling these individually I reduced the duplicated pagination from 13 instances to three. Then, in a separate pass, I worked out how to collapse those into a single paginate function that accepted one of three data-fetching function parameters.

I tried to pay careful attention to prompts and completions as I went along, but in the heat of the action I didn't do a great job of that, partly because I was switching back and forth between the two tools. But I'm quite pleased with the outcome. There was one key insight in particular which, fascinatingly, I am hard pressed to assign credit for. Was it me or one of the assistants? I think it was me, but in a way that doesn't matter, and isn't the point of this story.

The key insight

Here was the insight
. When I was building the transitional paginateStatus function, the first attempt returned results to the calling code in each table's List function, which was responsible for streaming the data to Steampipe. That led to a series of detours to work around the problem that the returned data could be quite large, and chew up a lot of memory. It could probably have been solved with a goroutine that would stream results back to the caller, instead of returning them as a batch. I tried prodding both LLMs to come up with that kind of solution, had no luck after a number of tries in both cases, but then came the insight. The helper functions could stream results directly to Steampipe, and simply return nil or err to the calling List function.

With that dramatic simplification I was able to complete the phase 1 refactoring, which yielded three pagination functions: paginateStatus, paginateAccount, and paginateNotification. Phase 2, which merged those into a single paginate function, was a bit more prosaic. I did need some help understanding how the necessary switch statements could switch on the timeline types passed into the paginate function. Both assistants had seen plenty of examples of that pattern, and both helpfully augmented my imperfect understanding of Go
idioms.

Partnering with machine intelligence

I came away with a profound sense that the real value of these assistants isn't any particular piece of code they get "right" or "wrong" but rather the process of collaborating with them. When you're working alone, you have an ongoing conversation with yourself, often in your own head. The point of talking to a rubber duck is to voice that conversation so you can reason about it more effectively. Externalizing your thinking in that way is intrinsically valuable. But when the rubber duck talks back, it's a whole new game. As Garry Kasparov famously wrote:

The winner was revealed to be not a grandmaster with a state-of-the-art PC but a pair of amateur American chess players using three computers at the same time. Their skill at manipulating and coaching their computers to look very deeply into positions effectively counteracted the superior chess understanding of their grandmaster opponents and the greater computational power of other participants. Weak human + machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + machine + inferior process.

I'm not worried about robot overlords. Instead, I look forward to collaborating with robot partners.

This series:
Autonomy, packet size, friction, fanout, and velocity
Mastodon, Steampipe, and RSS
Browsing the fediverse
A Bloomberg terminal for Mastodon
Create your own Mastodon UX
Lists and people on Mastodon
How many people in my Mastodon feed also tweeted today?
Instance-qualified Mastodon URLs
Mastodon relationship graphs
Working with Mastodon lists
Images considered harmful (sometimes)
Mapping the wider fediverse
Protocols, APIs, and conventions
News in the fediverse
Mapping people and tags in Mastodon
Visualizing Mastodon server moderation
Mastodon timelines for teams
The Mastodon plugin is now available on the Steampipe Hub
Migrating Mastodon lists
When the rubber duck talks back

Copyright © 2023 IDG Communications, Inc.