Common API Access from Postgres and SQLite – O’Reilly


In SQL: The Universal Solvent for REST APIs we saw how Steampipe’s suite of open-source plugins translates REST API calls directly into SQL tables. These plugins were, until recently, tightly bound to the open-source engine and to the instance of Postgres that it launches and controls. That led members of the Steampipe community to ask: “Can we use the plugins in our own Postgres databases?” Now the answer is yes, and more, but let’s focus on Postgres first.

NOTE: Every plugin in the Steampipe ecosystem is now also a standalone foreign data wrapper extension for Postgres, a virtual-table extension for SQLite, and an export tool.





Using a Steampipe Plugin as a Standalone Postgres Foreign Data Wrapper (FDW)

Visit Steampipe downloads to find the installer for your OS, and run it to acquire the Postgres FDW distribution of a plugin, in this case the GitHub plugin. It’s one of (currently) 140 plugins available on the Steampipe hub. Each plugin provides a set of tables that map API calls to database tables; in the case of the GitHub plugin, 55 such tables. Each table can appear in a FROM or JOIN clause; here’s a query to select columns from GitHub issues, filtering on repository and author.

select
  state,
  updated_at,
  title,
  url
from
  github_issue
where
  repository_full_name = 'turbot/steampipe'
  and author_login = 'judell'
order by
  updated_at desc

For those who’re utilizing Steampipe you may set up the GitHub plugin like this:

steampipe plugin set up github

then run the question within the Steampipe CLI, or in any Postgres consumer that may hook up with Steampipe’s occasion of Postgres.

However if you wish to do the identical factor in your personal occasion of Postgres you may set up the plugin another way.

$ sudo /bin/sh -c "$(
   curl -fsSL https://steampipe.io/install/postgres.sh)"
Enter the plugin name: github
Enter the version (latest): 

Discovered:
- PostgreSQL version:   14
- PostgreSQL location:  /usr/lib/postgresql/14
- Operating system:     Linux
- System architecture:  x86_64

Based on the above, steampipe_postgres_github.pg14.linux_amd64.tar.gz
will be downloaded, extracted and installed at: /usr/lib/postgresql/14

Proceed with installing Steampipe PostgreSQL FDW for version 14 at
 /usr/lib/postgresql/14?
- Press 'y' to proceed with the current version.
- Press 'n' to customize your PostgreSQL installation directory
and select a different version. (Y/n): 


Downloading steampipe_postgres_github.pg14.linux_amd64.tar.gz...
########################################################################### 100.0%
steampipe_postgres_github.pg14.linux_amd64/
steampipe_postgres_github.pg14.linux_amd64/steampipe_postgres_github.so
steampipe_postgres_github.pg14.linux_amd64/steampipe_postgres_github.control
steampipe_postgres_github.pg14.linux_amd64/steampipe_postgres_github--1.0.sql
steampipe_postgres_github.pg14.linux_amd64/install.sh
steampipe_postgres_github.pg14.linux_amd64/README.md

Download and extraction completed.

Installing steampipe_postgres_github in /usr/lib/postgresql/14...

Successfully installed steampipe_postgres_github extension!

Files have been copied to:
- Library directory: /usr/lib/postgresql/14/lib
- Extension directory: /usr/share/postgresql/14/extension/

Now connect to your server as usual, using psql or another client, most typically as the postgres user. Then run these commands, which are typical for any Postgres foreign data wrapper. As with any Postgres extension, you start like this:

CREATE EXTENSION steampipe_postgres_github;

To use a foreign data wrapper, you first create a server:

CREATE SERVER steampipe_github FOREIGN DATA WRAPPER
steampipe_postgres_github OPTIONS (config 'token="ghp_..."');

Use OPTIONS to configure the extension to use your GitHub access token. (Alternatively, the standard environment variables used to configure a Steampipe plugin, in this case just GITHUB_TOKEN, will work if you set them before starting your instance of Postgres.)

The tables provided by the extension will live in a schema, so define one:

CREATE SCHEMA github;

Now import the schema defined by the foreign server into the local schema you just created:

IMPORT FOREIGN SCHEMA github FROM SERVER steampipe_github INTO github;

Now run a query!

The foreign tables provided by the extension live in the github schema, so by default you’ll refer to tables like github.github_my_repository. If you set search_path = 'github', though, the schema prefix becomes optional and you can write queries using unqualified table names. Here’s a query we showed last time. It uses the github_search_repository table, which encapsulates the GitHub API for searching repositories.

Suppose you’re in search of repos associated to PySpark. Right here’s a question to search out repos whose names match pyspark, and report just a few metrics that can assist you gauge exercise and recognition.

select
  name_with_owner,
  updated_at,     -- how recently updated?
  stargazer_count -- how many people starred the repo?
from 
  github_search_repository 
where 
  query = 'pyspark in:name' 
order by
  stargazer_count desc
limit 10;
+---------------------------------------+------------+---------------+
|name_with_owner                        |updated_at  |stargazer_count|
+---------------------------------------+------------+---------------+
| AlexIoannides/pyspark-example-project | 2024-02-09 | 1324          |
| mahmoudparsian/pyspark-tutorial       | 2024-02-11 | 1077          |
| spark-examples/pyspark-examples       | 2024-02-11 | 1007          |
| palantir/pyspark-style-guide          | 2024-02-12 | 924           |
| pyspark-ai/pyspark-ai                 | 2024-02-12 | 791           |
| lyhue1991/eat_pyspark_in_10_days      | 2024-02-01 | 719           |
| UrbanInstitute/pyspark-tutorials      | 2024-01-21 | 400           |
| krishnaik06/Pyspark-With-Python       | 2024-02-11 | 400           |
| ekampf/PySpark-Boilerplate            | 2024-02-11 | 388           |
| commoncrawl/cc-pyspark                | 2024-02-12 | 361           |
+---------------------------------------+------------+---------------+

When you’ve got a number of repos, the primary run of that question will take just a few seconds. The second run will return outcomes immediately, although, as a result of the extension features a highly effective and complicated cache.

And that’s all there is to it! Every Steampipe plugin is now also a foreign data wrapper that works exactly like this one. You can load multiple extensions in order to join across APIs. Of course you can join any of these API-sourced foreign tables with your own Postgres tables. And to save the results of any query, you can prepend “create table NAME as” or “create materialized view NAME as” to a query to persist the results as a table or view.
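For example, here’s a sketch that persists the PySpark search results shown above; the view name pyspark_repos is arbitrary, and the github schema qualifier assumes the IMPORT FOREIGN SCHEMA step above.

```sql
-- Persist API-sourced results locally as a materialized view
create materialized view pyspark_repos as
select
  name_with_owner,
  updated_at,
  stargazer_count
from
  github.github_search_repository
where
  query = 'pyspark in:name';

-- Later, re-run the underlying API query on demand
refresh materialized view pyspark_repos;
```

Subsequent reads of pyspark_repos hit local storage rather than the GitHub API, until you refresh.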

Using a Steampipe Plugin as a SQLite Extension That Provides Virtual Tables

Visit Steampipe downloads to find the installer for your OS, and run it to acquire the SQLite distribution of the same plugin.

$ sudo /bin/sh -c "$(curl -fsSL https://steampipe.io/install/sqlite.sh)"
Enter the plugin name: github
Enter version (latest): 
Enter location (current directory): 

Downloading steampipe_sqlite_github.linux_amd64.tar.gz...
########################################################################### 100.0%
steampipe_sqlite_github.so

steampipe_sqlite_github.linux_amd64.tar.gz downloaded and
extracted successfully at /home/jon/steampipe-sqlite.

Here’s the setup. You can place this code in ~/.sqliterc if you want to run it every time you start sqlite.

.load /home/jon/steampipe-sqlite/steampipe_sqlite_github.so

select steampipe_configure_github('
  token="ghp_..."
');

Now you can run the same query as above. Here, too, the results are cached, so a second run of the query will be instant.

What about the differences between Postgres-flavored and SQLite-flavored SQL? The Steampipe hub is your friend! For example, it shows Postgres and SQLite variants of a query that accesses a field inside a JSON column in order to tabulate the languages associated with your gists.


The github_my_gist table reports details about gists that belong to the GitHub user who’s authenticated to Steampipe. The language associated with each gist lives in a JSONB column called files, which contains a list of objects like this.

{
   "size": 24541,
   "type": "text/markdown",
   "raw_url": "https://gist.githubusercontent.com/judell/49d66ca2a5d2a3b
   "filename": "steampipe-readme-update.md",
   "language": "Markdown"
}

The functions needed to project that list as rows differ: in Postgres you use jsonb_array_elements and in SQLite it’s json_each.
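Here’s a sketch of the two variants, assuming the files column shown above; the gists alias is illustrative.

```sql
-- Postgres: expand the JSONB array with jsonb_array_elements
select
  f ->> 'language' as language,
  count(*) as gists
from
  github_my_gist g,
  jsonb_array_elements(g.files) as f
group by
  language
order by
  gists desc;

-- SQLite: expand the JSON array with json_each
select
  json_extract(j.value, '$.language') as language,
  count(*) as gists
from
  github_my_gist g,
  json_each(g.files) j
group by
  language
order by
  gists desc;
```

In both cases each element of files becomes a row that can be grouped and counted.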

As with the Postgres extensions, you can load multiple SQLite extensions in order to join across APIs. You can join any of these API-sourced foreign tables with your own SQLite tables. And you can prepend “create table NAME as” to a query to persist the results as a table.

Using a Steampipe Plugin as a Standalone Export Tool

Visit Steampipe downloads to find the installer for your OS, and run it to acquire the export distribution of a plugin; again, we’ll illustrate using the GitHub plugin.

$ sudo /bin/sh -c "$(curl -fsSL https://steampipe.io/install/export.sh)"
Enter the plugin name: github
Enter the version (latest): 
Enter location (/usr/local/bin): 
Created temporary directory at /tmp/tmp.48QsUo6CLF.

Downloading steampipe_export_github.linux_amd64.tar.gz...
############################################################################## 100.0%
Deflating downloaded archive
steampipe_export_github
Installing
Applying necessary permissions
Removing downloaded archive
steampipe_export_github was installed successfully to
/usr/local/bin
$ steampipe_export_github -h
Export data using the github plugin.

Find detailed usage information including table names,
column names, and examples at the Steampipe Hub:
https://hub.steampipe.io/plugins/turbot/github

Usage:
  steampipe_export_github TABLE_NAME [flags]

Flags:
      --config string       Config file data
  -h, --help                help for steampipe_export_github
      --limit int           Limit data
      --output string       Output format: csv, json or jsonl (default "csv")
      --select strings      Column data to display
      --where stringArray   where clause data

There’s no SQL engine in the picture here; this tool is simply an exporter. To export all your gists to a JSON file:

steampipe_export_github github_my_gist --output json > gists.json

To select just a few columns and export to a CSV file:

steampipe_export_github github_my_gist --output csv --select
 "description,created_at,html_url" > gists.csv

You can use --limit to limit the rows returned, and --where to filter them, but mostly you’ll use this tool to quickly and easily grab data that you’ll massage elsewhere, for example in a spreadsheet.
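For instance, a sketch combining both flags (the column names and the where-clause syntax here are assumptions about the github_my_repository table, not documented behavior):

steampipe_export_github github_my_repository \
  --select "name,updated_at" \
  --where "fork = false" \
  --limit 10 \
  --output csv > repos.csv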

Tap into the Steampipe Plugin Ecosystem

Steampipe plugins aren’t just raw interfaces to underlying APIs. They use tables to model those APIs in useful ways. For example, the github_my_repository table exemplifies a design pattern that applies consistently across the suite of plugins. From the GitHub plugin’s documentation:

You can own repositories individually, or you can share ownership of repositories with other people in an organization. The github_my_repository table will list repos that you own, that you collaborate on, or that belong to your organizations. To query ANY repository, including public repos, use the github_repository table.

Other plugins follow the same pattern. For example, the Microsoft 365 plugin provides both microsoft_my_mail_message and microsoft_mail_message, and the Google Workspace plugin provides googleworkspace_my_gmail_message and googleworkspace_gmail. Where possible, plugins consolidate views of resources from the perspective of an authenticated user.

While plugins typically provide tables with fixed schemas, that’s not always the case. Dynamic schemas, implemented by the Airtable, CSV, Kubernetes, and Salesforce plugins (among others), are another key pattern. Here’s a CSV example using a standalone Postgres FDW.

IMPORT FOREIGN SCHEMA csv FROM SERVER steampipe_csv INTO csv 
 OPTIONS(config 'paths=["/home/jon/csv"]');

Now all the .csv files in /home/jon/csv will automagically be Postgres foreign tables. Suppose you keep track of valid owners of EC2 instances in a file called ec2_owner_tags. Here’s a query against the corresponding table.

select * from csv.ec2_owner_tags;
     owner      |            _ctx
----------------+----------------------------
 Pam Beesly     | {"connection_name": "csv"}
 Dwight Schrute | {"connection_name": "csv"}

You could join that table with the AWS plugin’s aws_ec2_instance table to report owner tags on EC2 instances that are (or are not) listed in the CSV file.

select 
    ec2.owner,
    case 
        when csv.owner is null then 'false'
        else 'true'
    end as is_listed
from 
    (select distinct tags ->> 'owner' as owner 
     from aws.aws_ec2_instance) ec2
left join 
    csv.ec2_owner_tags csv on ec2.owner = csv.owner;
     owner      | is_listed
----------------+-----------
 Dwight Schrute | true
 Michael Scott  | false

Across the suite of plugins there are more than 2,300 predefined fixed-schema tables that you can use in these ways, plus an unlimited number of dynamic tables. And new plugins are constantly being added by Turbot and by Steampipe’s open-source community. You can tap into this ecosystem using Steampipe or Turbot Pipes, from your own Postgres or SQLite database, or directly from the command line.


