Hi, I'm a newbie here and I've got a question for you. If you have a tool that doesn't have an existing connector but does expose APIs, what methodology do you use to quickly extract the data and make it available in your data stack? Do you have a standard way of doing that, or does it depend on the source?
Hi, the first question is: which orchestration tool do you use for ELT/ETL? For example, we consume APIs with simple Python code (the requests library), put the data into a pandas DataFrame, and then export it to SQL via pandas.to_sql.
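A minimal sketch of that requests → DataFrame → to_sql flow. The endpoint URL, the sample payload, and the table name are all placeholders I made up so the sketch runs offline; in real use `fetch_records` would call `requests.get(...).json()` against your source API:

```python
import sqlite3

import pandas as pd


def fetch_records(api_url: str) -> list[dict]:
    """Fetch one page of JSON records from the API.

    In real use this would be something like:
        requests.get(api_url, headers={"Authorization": "Bearer <token>"},
                     timeout=30).json()
    Here we return a canned payload so the example runs without a network call.
    """
    return [
        {"id": 1, "customer": {"name": "Acme"}, "total": 120.5},
        {"id": 2, "customer": {"name": "Globex"}, "total": 99.0},
    ]


records = fetch_records("https://api.example.com/v1/orders")  # placeholder URL

# json_normalize flattens nested JSON, e.g. customer.name becomes its own column.
df = pd.json_normalize(records)

# Load into SQL. An in-memory SQLite database stands in for your warehouse here;
# for Snowflake/BigQuery/Redshift you'd pass a SQLAlchemy engine instead.
con = sqlite3.connect(":memory:")
df.to_sql("orders_raw", con, if_exists="replace", index=False)
```

The same pattern extends naturally: add pagination inside `fetch_records`, and schedule the script from whatever orchestrator you already run.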
Assuming "available in your data stack" means the data ends up in tables in a cloud data warehouse like Snowflake/BigQuery/Redshift... I'd advise looking at cloud ELT tools like Rivery (www.rivery.io). In full transparency, I work for Rivery and joined exactly because I saw clients (I was previously in consulting) looking for an easy way to connect to many APIs without being charged a huge amount for each new connector. Action Rivers let you build a connector to a REST API in minutes without needing to know how to code.