Show HN: ActionPrompt – A Rails Plugin for Managing Your LLM Prompts

2 points | posted 2 months ago
by evtothedev

1 comment

graypegg

2 months ago

From the codebase you extracted this from, how are you sending these prompt strings onward? I feel like it would really benefit from taking inspiration from all of ActionMailer, not just ActionMailer::Preview. Being able to just fire these off in a job would be useful. Even better if you could act on the responses as part of the job, so kind of like a little controller/agent thing?

    config.action_prompt.delivery_method = :chatgpt_api
    config.action_prompt.chatgpt_api_settings = {
      api_key: ENV["OPENAI_API_KEY"]
    }

    ...

    class ApplicationPrompt < ActionPrompt::Base
      before_action :set_user

      def set_user
        @user = current_user || AnonUser.new
      end
    end

    class SupportAgentPrompt < ApplicationPrompt
      def ask(question)
        @question = question.strip.downcase

        if result = prompt(question:)    # renders `views/prompts/support_agent/ask.text.erb`;
                                         # ideally detects JSON and strips whitespace from the response
          broadcast_append_to @user,
                              target: :support_chat,
                              partial: 'support_chat/message',
                              locals: { message: result, from: SupportAgentUser.new }
        end
      end
    end

    ...

    SupportAgentPrompt.ask("What's the best menu item?").prompt_later
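
The `prompt_later` call above implies ActionMailer's lazy-delivery pattern: `SupportAgentPrompt.ask(...)` doesn't run the prompt, it returns a delivery proxy, and the proxy decides whether to execute now or enqueue a job. Here's a minimal plain-Ruby sketch of that pattern (no Rails, no ActiveJob); `PromptDelivery`, `BasePrompt`, and the in-memory queue are all hypothetical stand-ins, not part of ActionPrompt:

```ruby
# Lazy-delivery proxy: holds the prompt class, action name, and
# arguments until someone decides how to run it.
class PromptDelivery
  def initialize(prompt_class, action, args)
    @prompt_class = prompt_class
    @action = action
    @args = args
  end

  # Run the prompt action immediately, like ActionMailer's deliver_now.
  def prompt_now
    @prompt_class.new.public_send(@action, *@args)
  end

  # In Rails this would enqueue an ActiveJob; here we just record the
  # work in an in-memory queue a worker could drain later.
  def prompt_later
    PromptDelivery.queue << [@prompt_class, @action, @args]
    self
  end

  def self.queue
    @queue ||= []
  end
end

class BasePrompt
  # Mirror ActionMailer: calling an instance method on the class
  # returns a delivery proxy instead of invoking it directly.
  def self.method_missing(name, *args)
    if public_instance_methods(false).include?(name)
      PromptDelivery.new(self, name, args)
    else
      super
    end
  end

  def self.respond_to_missing?(name, include_private = false)
    public_instance_methods(false).include?(name) || super
  end
end

class SupportAgentPrompt < BasePrompt
  def ask(question)
    # Stub body standing in for the real prompt/broadcast logic.
    "answered: #{question.strip.downcase}"
  end
end
```

With this in place, `SupportAgentPrompt.ask("What's the best menu item?").prompt_later` enqueues the work, while `.prompt_now` runs it inline, matching the `deliver_later`/`deliver_now` split.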