Absinthe does a lot of work when you fire a GraphQL query at it. The incoming query is parsed into an internal representation, validated, and finally executed. This processing happens in phases: individual modules, chained together in a pipeline, that each perform a single step in processing the query.
Phases are the building blocks of validating and executing a GraphQL query. For example, they check that arguments declared as non-null are actually present, and trigger errors when they are not; they check that arguments are of the correct type; they analyze the complexity of the query; and so on.
The default pipeline can be seen here: https://github.com/absinthe-graphql/absinthe/blob/v1.4.13/lib/absinthe/pipeline.ex#L43
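If you would rather inspect the pipeline from IEx than read the source, you can build it for a schema and print the phase modules. A sketch, assuming Absinthe is a dependency of your project; MyApp.Schema is a placeholder for your own schema module:

```elixir
# Build the default document pipeline for a schema and print each phase.
# Pipeline entries are either a bare phase module or a {phase_module, options}
# tuple, so we match on both shapes.
MyApp.Schema
|> Absinthe.Pipeline.for_document()
|> Enum.each(fn
  {phase, _opts} -> IO.inspect(phase)
  phase -> IO.inspect(phase)
end)
```

This prints the phases in the order they will run, which is handy for deciding where to insert your own.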
Absinthe is extensible, which means we can add our own phases to the pipeline, at a place of our choosing. In this example I want to add a phase that logs the complexity of a document. To do so, we add a phase after the complexity has been analyzed, but before query execution is cut short because the analyzed complexity exceeded the maximum.
First, we hook up a custom pipeline. I assume this is a Phoenix application, so we configure it in router.ex:
forward(
  "/api",
  Absinthe.Plug.GraphiQL,
  schema: Project.Graphql.Schema,
  analyze_complexity: true,
  max_complexity: 1000,
  pipeline: {__MODULE__, :pipeline}
)
...
def pipeline(config, pipeline_opts) do
  config.schema_mod
  |> Absinthe.Pipeline.for_document(pipeline_opts)
  |> Absinthe.Pipeline.insert_after(
    Absinthe.Phase.Document.Complexity.Analysis,
    Project.Phase.LogComplexity
  )
end
In the forward call, the pipeline keyword argument tells Absinthe to call the pipeline/2 function in the current module to build the pipeline. That function inserts the Project.Phase.LogComplexity module after Absinthe.Phase.Document.Complexity.Analysis, the phase responsible for analyzing the complexity. You'll note that we also enabled complexity analysis in the options given to the Absinthe plug.
The Absinthe.Phase.Document.Complexity.Result module is the one that stops execution if the complexity is too high; it normally runs straight after Absinthe.Phase.Document.Complexity.Analysis. With the change above, our logging module now sits in between.
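Since the goal is "after Analysis, but before Result", an equivalent way to express the insertion is Absinthe.Pipeline.insert_before/3, targeting the Result phase instead. A sketch of the same pipeline/2 function written that way:

```elixir
# Equivalent sketch: insert the logging phase right before the Result phase
# that would halt execution, instead of right after the Analysis phase.
# The effect is the same here, since Result normally follows Analysis directly.
def pipeline(config, pipeline_opts) do
  config.schema_mod
  |> Absinthe.Pipeline.for_document(pipeline_opts)
  |> Absinthe.Pipeline.insert_before(
    Absinthe.Phase.Document.Complexity.Result,
    Project.Phase.LogComplexity
  )
end
```

Which variant to prefer is a matter of intent: insert_before(Result) says "log before execution can be cut short", while insert_after(Analysis) says "log as soon as the numbers exist".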
What does this logging module look like?
defmodule Project.Phase.LogComplexity do
  use Absinthe.Phase

  require Logger

  def run(input, _options \\ []) do
    operation = Absinthe.Blueprint.current_operation(input)
    {_operation, max} = Absinthe.Blueprint.prewalk(operation, 0, &handle_node/2)
    Logger.info("Query complexity: #{inspect(max)}")
    {:ok, input}
  end

  # Keep track of the highest complexity seen so far. The guard skips nodes
  # whose complexity field is present but still nil (i.e. never analyzed),
  # which would otherwise compare as greater than any number.
  defp handle_node(%{complexity: complexity} = node, max) when is_number(complexity) do
    if complexity > max do
      {node, complexity}
    else
      {node, max}
    end
  end

  defp handle_node(node, max) do
    {node, max}
  end
end
A phase takes an Absinthe.Blueprint document and returns another Blueprint document. Every phase has a run/2 function that is called with the document. In run/2 we pass handle_node/2 to Absinthe.Blueprint.prewalk/3: the first argument is the current operation, the second is an accumulator (0, because that's the minimum complexity we start out with), and the third is handle_node/2 itself.
The prewalk walks through every node, threading the accumulator along. In our case we only look at nodes with a complexity set; this field is filled in by Absinthe.Phase.Document.Complexity.Analysis. If it is set, we check whether it is the highest complexity we've seen so far: if so, we return it as the new maximum, otherwise we keep the old one. Nodes without a complexity are ignored. At the end we log the highest complexity by passing the accumulator to the logger.
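This prewalk-with-accumulator pattern is not Absinthe-specific; the standard library's Macro.prewalk/3 has the same shape. A standalone sketch that finds the largest integer literal in a quoted expression, where integer literals play the role of nodes with a complexity set:

```elixir
# Standalone illustration of the same accumulate-the-maximum fold, using
# Macro.prewalk/3 on an Elixir AST instead of an Absinthe blueprint.
ast = quote do: 1 + max(2, 10)

{_ast, biggest} =
  Macro.prewalk(ast, 0, fn
    # A bigger integer than any seen so far becomes the new maximum.
    node, acc when is_integer(node) and node > acc -> {node, node}
    # Everything else passes through unchanged, like the catch-all
    # handle_node/2 clause above.
    node, acc -> {node, acc}
  end)

IO.inspect(biggest)
# prints 10
```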
I hope this gives some insight into what you can do with Absinthe phases. They are a really flexible way to work with the internal representation of GraphQL queries. If you want to write your own, I'd advise looking at the phases already in the Absinthe package and learning from them.