April 19, 2023 10:07 pm GMT

Service Objects in Ruby on Rails: Best Practices for Clean and Maintainable Code

Often we need our Ruby on Rails application to communicate with the outside world, either to consume a service that provides us with some kind of information or to send information ourselves.

There are two ways to do this.

The caveman's way: simply use HTTParty and put a method that contacts the API directly in your controller or model. This works without any problems, it's fast, and it's perhaps easy to maintain if the project is yours alone.

The Ruby on Rails way: create a directory called "services" inside the "app" directory, and put everything related to the external services or APIs your project consumes there.

And since it's all the rage right now, we are going to connect to ChatGPT.

Ok, let's get to work!

Install HTTParty

gem 'httparty'

We're going to use the HTTParty gem to perform all our HTTP requests.
To install it, just add the line above to your Gemfile and then run bundle install.

Service configuration

To communicate with ChatGPT we need to pass an API key (or token) in our request headers, and to keep it secure, we will save it in the Rails credentials file.
You can edit the credentials file by running this command in your terminal:

EDITOR="nano" bin/rails credentials:edit

That will open your credentials file using "nano" as the text editor.
Once it's open, add the following configuration line:

chatgpt_api_key: YOUR-API-KEY

Save and close the editor; if you're using nano, Ctrl + X will do the trick.
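Later, the service can read this key back from the credentials store. Here is a minimal sketch, assuming Rails 5.2+ encrypted credentials; the KeyError guard and the injectable credentials argument are my own additions so the helper fails loudly and can also be exercised outside a Rails console:

```ruby
# Sketch: reading the key saved above. In the app you would call
# Rails.application.credentials directly; here the credentials object
# can be injected, so the helper also works without Rails loaded.
def chatgpt_api_key(credentials = defined?(Rails) ? Rails.application.credentials : nil)
  key = credentials&.chatgpt_api_key
  raise KeyError, 'chatgpt_api_key missing from Rails credentials' if key.nil?

  key
end
```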

Building the service

We can treat each of our services as a module; see the following structure:

➜  chatgpt_service git:(master) tree app/services
app/services
├── chat_gpt
│   ├── data
│   │   ├── chat_completions_choice_item_data.rb
│   │   ├── completions_choice_item_data.rb
│   │   ├── models_item_data.rb
│   │   └── models_list_data.rb
│   ├── responses
│   │   ├── base_response.rb
│   │   ├── chat_completions_response.rb
│   │   ├── completions_response.rb
│   │   └── models_list_response.rb
│   └── service.rb
└── http_response.rb

4 directories, 10 files

The idea is to have a module that encapsulates the API we want to consume, and inside it a Service class with all the methods to request or send information.
Inside responses we store the classes in charge of handling the information returned by the API.
If the API returns complex data, or many objects nested inside each other, you can use data classes like the ones inside the data directory.
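As a hedged sketch of what such a data class can look like (the real classes live in the repository linked below and may differ), it can be as simple as a wrapper exposing readers over the raw hash; the attribute names here mirror the OpenAI models payload:

```ruby
# Hypothetical sketch of a data class along the lines of
# app/services/chat_gpt/data/models_item_data.rb: it wraps one item
# from the API payload and exposes plain readers over it.
module ChatGpt
  module Data
    class ModelsItemData
      attr_reader :params

      def initialize(params)
        @params = params
      end

      def id
        params['id']
      end

      def owned_by
        params['owned_by']
      end
    end
  end
end
```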

Don't worry about the code; I've prepared a public GitHub repository with the project. Take a look at it at: https://github.com/paul-ot/ror_chatgpt_client

To explain the structure above: the service has one public method for each endpoint we use. In our case, the service has three public methods:
completions, chat_completions, and models_list.

module ChatGpt
  class Service
    include HTTParty
    base_uri 'https://api.openai.com/v1'

    # ...

    def completions(prompt, model = nil)
      @model = model unless model.nil?
      response = self.class.post(
        COMPLETIONS_PATH,
        body: completions_body(prompt),
        headers: common_headers,
        timeout: TIMEOUT_SECONDS
      )
      ChatGpt::Responses::CompletionsResponse.new(response)
    end

    def chat_completions(message, model = nil)
      @model = model unless model.nil?
      response = self.class.post(
        CHAT_COMPLETIONS_PATH,
        body: chat_completions_body(message),
        headers: common_headers,
        timeout: TIMEOUT_SECONDS
      )
      ChatGpt::Responses::ChatCompletionsResponse.new(response)
    end

    def models_list
      response = self.class.get(
        MODELS_PATH,
        headers: common_headers,
        timeout: TIMEOUT_SECONDS
      )
      ChatGpt::Responses::ModelsListResponse.new(response)
    end

    # ...
  end
end

Each method returns its own response class, stored inside the responses directory. We do it this way to isolate the responses, with each class defining its own methods for extracting information from the JSON it wraps.

Pay attention to the responses

Completions method:

    def completions(prompt, model = nil)
      # ...
      ChatGpt::Responses::CompletionsResponse.new(response)
    end

Chat completions method:

    def chat_completions(message, model = nil)
      # ...
      ChatGpt::Responses::ChatCompletionsResponse.new(response)
    end

The models list:

    def models_list
      # ...
      ChatGpt::Responses::ModelsListResponse.new(response)
    end
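To make the isolation concrete, here is a hedged, standalone sketch of what a class like CompletionsResponse could expose. The `text` reader is an assumption based on the OpenAI completions payload, and unlike the real class in the repository, this version parses the body inline so it runs on its own:

```ruby
require 'json'

# Hypothetical standalone sketch of a completions response. The real
# ChatGpt::Responses::CompletionsResponse in the repository gets its
# parsing from its parent classes instead of inlining it like this.
class CompletionsResponseSketch
  def initialize(response)
    @response = response
  end

  # Parsed JSON body, or an empty hash on malformed input.
  def response_body
    JSON.parse(@response.body)
  rescue JSON::ParserError
    {}
  end

  # First generated completion, if any (assumed payload shape).
  def text
    response_body.dig('choices', 0, 'text')
  end
end
```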

Each response class inherits from a ChatGpt::Responses::BaseResponse class, which contains the methods common across responses, such as the error message.

BaseResponse

module ChatGpt
  module Responses
    class BaseResponse < HttpResponse
      def error_message
        response_body.dig('error', 'message')
      end
    end
  end
end

In turn, ChatGpt::Responses::BaseResponse inherits from the HttpResponse class, which contains the most generic methods: for example, one in charge of determining whether the response was OK, and another that parses the response body into a JSON object.

HttpResponse

class HttpResponse
  attr_reader :response

  def initialize(response)
    @response = response
  end

  def response_body
    response.body.present? ? JSON.parse(response.body) : {}
  rescue JSON::ParserError
    {}
  end

  def successful?
    response.code.to_i == Rack::Utils::SYMBOL_TO_STATUS_CODE[:ok]
  end

  def failed?
    !successful?
  end
end

I did it this way to make it more reusable, so you can have several services, each with its own responses, all inheriting from this HttpResponse class.
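For example, a second, hypothetical service (WeatherApi is invented here purely for illustration) could define its own base response on top of the same class. A trimmed stand-in for HttpResponse is repeated below so the sketch runs on its own, without Rails or Rack, which is why the 200 status code is hard-coded:

```ruby
require 'json'

# Trimmed stand-in for the HttpResponse class shown above, so this
# sketch is self-contained (no Rails `present?`, no Rack status map).
class HttpResponse
  attr_reader :response

  def initialize(response)
    @response = response
  end

  def response_body
    response.body.to_s.empty? ? {} : JSON.parse(response.body)
  rescue JSON::ParserError
    {}
  end

  def successful?
    response.code.to_i == 200
  end

  def failed?
    !successful?
  end
end

# A hypothetical second service reusing the same base class,
# with its own error shape.
module WeatherApi
  module Responses
    class BaseResponse < HttpResponse
      def error_message
        response_body.dig('error', 'message')
      end
    end
  end
end
```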

Using the service is very simple:

1- Create a service instance

> service = ChatGpt::Service.new
=> #<ChatGpt::Service:0x00007fccde2ed0d0 @model="gpt-3.5-turbo">

2- You can try to get the GPT models list as follows

> response = service.models_list
=> #<ChatGpt::Responses::ModelsListResponse:0x00007fccdd7d6390 @response=#<HTTParty::Response:0x7fccdd6c7ad0 parsed_response={"object"=>"list", "data"=>[{"id"=>"babbage", "object"=>"model", "created"=>1649358449, "owned_by"=>"openai", "permission"=>[{"id"=>"modelperm-49FUp5v084tBB49tC4z8LPH5", ...>>
> response.items.count
=> 64
> response.items.first
=> #<ChatGpt::Data::ModelsItemData:0x00007fccde9cfa60 @params={"id"=>"babbage", "object"=>"model", "created"=>1649358449, "owned_by"=>"openai", "permission"=>[{"id"=>"modelperm-49FUp5v084tBB49tC4z8LPH5", "object"=>"model_permission", "created"=>1669085501, "allow_create_engine"=>false, "allow_sampling"=>true, "allow_logprobs"=>true, "allow_search_indices"=>false, "allow_view"=>true, "allow_fine_tuning"=>false, "organization"=>"*", "group"=>nil, "is_blocking"=>false}], "root"=>"babbage", "parent"=>nil}>
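In application code you will usually branch on the outcome. A hedged sketch (the `chat_result` helper and its hash shape are invented for illustration; `successful?`, `response_body`, and `error_message` are the methods from the response classes shown earlier):

```ruby
# Hypothetical helper: turn any service response into a plain hash.
# It relies only on the duck type of the response classes above.
def chat_result(response)
  if response.successful?
    { ok: true, data: response.response_body }
  else
    { ok: false, error: response.error_message }
  end
end
```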

For more examples, please refer to the GitHub repository; its README.md file includes them.

I hope I've contributed something useful, and that I've managed to explain my point of view on the use of services in Ruby on Rails.

Thanks for reading!


Original Link: https://dev.to/paulmarclay/service-objects-in-ruby-on-rails-best-practices-for-clean-and-maintainable-code-2odk
