I am a French full stack developer based in Bordeaux, France. I work for Codeurs en Liberté, a cooperative of freelance developers 🧡. My current professional interests include Elixir, cartography and mobility (open) data.
How to constrain a JSONB field in Postgres
In Postgres, JSONB fields are very useful for storing semi-structured data. But sometimes the flexibility they offer can be a little bit scary. Imagine you want to store the result of some sort of inspection, which can lead to 2 different outputs, “create” or “update”, along with other log information. You expect your JSON field values to look like this: { "action": "create", "some_useful_log_info": ... } or { "action": "update", "reason": ....
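One way to enforce that shape is a Postgres check constraint on the JSONB column. Here is a minimal sketch written as an Ecto migration, assuming a hypothetical `inspections` table with a `result` JSONB column (both names are illustrative):

```elixir
defmodule MyApp.Repo.Migrations.ConstrainInspectionResult do
  use Ecto.Migration

  def change do
    # Reject any JSON document whose "action" is not one of the two
    # expected values; other keys can be checked the same way.
    create constraint(:inspections, :result_action_must_be_valid,
             check: "result->>'action' IN ('create', 'update')"
           )
  end
end
```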
Phoenix LiveDashboard with Content Security Policy (CSP)
If your Phoenix application enforces CSP rules and you try to deploy the Phoenix LiveDashboard in production, you will probably get something like this: In my case, inline CSS is not loaded because of the style-src CSP rule I had to enforce on the project: style-src 'self'; This means that all inline CSS is blocked by the browser. Unfortunately, the Phoenix LiveDashboard uses inline CSS, and that’s not something I can change....
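If you want to keep the strict CSP, LiveDashboard can stamp a per-request nonce onto its inline assets. A sketch of the router side, assuming a plug upstream assigns the nonce under `:csp_nonce_value` and adds the matching `'nonce-…'` value to the style-src header (the pipeline names are illustrative):

```elixir
# router.ex — a sketch; :admins_only and the nonce-assigning plug are assumptions
import Phoenix.LiveDashboard.Router

scope "/" do
  pipe_through [:browser, :admins_only]

  # :csp_nonce_assign_key tells LiveDashboard which assign holds the nonce
  # to inject into its inline <style> and <script> tags
  live_dashboard "/dashboard", csp_nonce_assign_key: :csp_nonce_value
end
```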
Keep Elixir test output clean
Does your Elixir test output look like this? If so, keep reading. It is common to use the Logger library to log things that happen in your application.

```elixir
def query_api(url) do
  with {:ok, %{status_code: 200, body: body}} <- HTTPoison.get(url),
       {:ok, json} <- Jason.decode(body) do
    {:ok, json}
  else
    e ->
      Logger.error("impossible to query api: #{inspect(e)}")
      {:error, "api service unavailable"}
  end
end
```

It is also common to test the behavior of your application, in different scenarios....
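One way to keep that expected error message out of the test output is ExUnit's built-in log capturing; a sketch, where `MyApp.ApiClient` and the URL are illustrative names:

```elixir
defmodule MyApp.ApiClientTest do
  use ExUnit.Case, async: true

  import ExUnit.CaptureLog

  test "returns an error when the API is unreachable" do
    # capture_log/1 swallows the Logger output produced inside the block,
    # so the expected error no longer pollutes the test output
    log =
      capture_log(fn ->
        assert {:error, "api service unavailable"} =
                 MyApp.ApiClient.query_api("http://bad.example")
      end)

    # and you can still assert on what was logged
    assert log =~ "impossible to query api"
  end
end
```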
Stream S3 > zip > CSV > Postgres with Elixir
I was recently confronted with this situation:

- an S3 bucket contains a zip file
- the zip file contains a CSV
- I want to take the content of that CSV, transform it, and push it to a Postgres database

Without streaming

Without streaming, the steps to do that are the following:

- download the zip archive locally (maybe it is huge!)
- unzip the archive
- open the CSV file and load its content in memory (maybe it is huge!...
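As a point of reference, the naive version of those steps might look like this sketch, where the bucket, key, file paths, `people` table and `MyApp.Repo` are all placeholders, and ExAws / NimbleCSV are assumptions about tooling:

```elixir
# 1. download the zip archive locally (the whole file hits the disk)
"my-bucket"
|> ExAws.S3.download_file("export.zip", "/tmp/export.zip")
|> ExAws.request!()

# 2. unzip the archive next to it
{:ok, _files} = :zip.unzip(~c"/tmp/export.zip", cwd: ~c"/tmp")

# 3. load the whole CSV in memory, transform it, and push it to Postgres
rows =
  "/tmp/export.csv"
  |> File.read!()
  |> NimbleCSV.RFC4180.parse_string()
  |> Enum.map(fn [name, age] -> %{name: name, age: String.to_integer(age)} end)

MyApp.Repo.insert_all("people", rows)
```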
Stream data from an API to your database with Elixir
Make your Elixir streams come true

Last time we talked about simply streaming a paginated API with Elixir. It was fun, but somehow pointless, because we ended up writing this:

```elixir
datasets =
  stream_api.("https://data.gouv.fr/api/1/datasets/")
  |> Stream.take(50)
  |> Enum.to_list()
```

Streams in Elixir are lazy, meaning that no work will be done by the stream until necessary. If you just write:

```elixir
datasets =
  stream_api.("https://data.gouv.fr/api/1/datasets/")
  |> Stream.take(50)
```

you create a stream and say that you are only interested in the first 50 elements....
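Forcing the stream with the database as its consumer is where this pays off. A minimal sketch, assuming an Ecto repo `MyApp.Repo`, a plain `datasets` table, and that each element of the stream is a decoded dataset map (all assumptions of mine):

```elixir
# No API call is made while the pipeline is being built; pages are only
# fetched, one by one, when Stream.run/1 forces the stream, and each batch
# is written to Postgres as soon as it is available.
stream_api.("https://data.gouv.fr/api/1/datasets/")
|> Stream.take(50)
|> Stream.map(fn dataset -> %{title: dataset["title"], slug: dataset["slug"]} end)
|> Stream.chunk_every(100)
|> Stream.each(fn batch -> MyApp.Repo.insert_all("datasets", batch) end)
|> Stream.run()
```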
Stream a paginated API with Elixir
The goal

You want to fetch data from an external API with Elixir, and this API is paginated. Your code needs to make API calls, but just the right number of them. Maybe you’re only interested in the first 2 pages of results, and don’t want to fetch everything. For this example, we’ll use the data.gouv.fr API, the French open data portal. Let’s request a list of all datasets available....
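One way to build such a lazy, page-by-page stream is `Stream.resource/3`. A sketch, assuming HTTPoison and Jason for the HTTP and JSON parts, and assuming the API response carries its items under "data" and the URL of the following page under "next_page":

```elixir
defmodule DataGouv do
  # Lazily emits datasets, fetching the next page only when needed.
  def stream_datasets(start_url) do
    Stream.resource(
      # start with the URL of the first page
      fn -> start_url end,
      fn
        # no more pages: stop the stream
        nil ->
          {:halt, nil}

        url ->
          %HTTPoison.Response{status_code: 200, body: body} = HTTPoison.get!(url)
          %{"data" => datasets, "next_page" => next_page} = Jason.decode!(body)
          # emit this page's datasets, remember where the next page lives
          {datasets, next_page}
      end,
      # nothing to clean up
      fn _ -> :ok end
    )
  end
end

# Only the pages needed for the first 50 datasets are actually requested.
DataGouv.stream_datasets("https://data.gouv.fr/api/1/datasets/")
|> Stream.take(50)
|> Enum.to_list()
```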