| Repository | Stars | Forks | Last commit | Latest release | License | Language | Description |
|---|---|---|---|---|---|---|---|
| yuchiu/Netflix-Clone | 245 | 67 | over 3 years ago | | MIT | JavaScript | Netflix-like full-stack application with an SPA client and a backend implemented in a service-oriented architecture |
| el10savio/GoCrawler | 60 | 0 | over 5 years ago | | MIT | Go | A distributed web crawler implemented using Go, Postgres, RabbitMQ and Docker |
| allanbmartins/Projeto_ETL_RFB_IBGE_ANP | 38 | 0 | about 2 years ago | | MIT | Python | Python and PostgreSQL extract-transform-load (ETL) of public data from the Brazilian Federal Revenue Service (RFB), the Brazilian Institute of Geography and Statistics (IBGE), and the National Agency for Petroleum, Natural Gas and Biofuels (ANP) |
| ahmedshahriar/bd-medicine-scraper | 25 | 2 | over 2 years ago | | Apache-2.0 | Python | Scrapy-Django PostgreSQL-integrated API with proxy IP configuration that scrapes all medicine data (meds, prices, generics, companies, indications) from Bangladesh (30k+ pages) |
| redwanhuq/campaign-critic | 16 | 0 | over 8 years ago | | | Jupyter Notebook | A web app that helps Kickstarter creators increase their chances of being funded |
| brianamaral/covid-news | 16 | 0 | almost 5 years ago | | | Python | A personal data engineering project for applying some of my skills |
| datatogether/sentry | 14 | 25 | over 7 years ago | October 02, 2018 | AGPL-3.0 | Go | Parallelized web crawler written in Golang |
| WeebSearch/worker | 13 | 0 | about 7 years ago | | GPL-3.0 | TypeScript | ⚒ Web crawler that analyzes and dissects subtitles into database entries |
| fooock/robots.txt | 13 | 7 | over 5 years ago | | GPL-3.0 | Java | :robot: robots.txt as a service. Crawls robots.txt files, then downloads and parses them to check rules through an API |
| kitestring/DataQuest | 12 | 0 | almost 8 years ago | | | Jupyter Notebook | Data science MOOC: all the code, notes, and supplementary materials generated over the course of my data science learning |
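
Several entries above (GoCrawler, datatogether/sentry) describe distributed or parallelized web crawlers. The core loop can be sketched in a few lines of Python; here an in-memory deque stands in for the RabbitMQ work queue and a set for the Postgres-backed visited table, and the toy link graph replaces real HTTP fetches — none of this reflects those repos' actual code.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

def crawl(seed, fetch_links, max_pages=50):
    """Breadth-first crawl sketch: a deque stands in for the message
    queue and a set for the database of already-visited URLs."""
    queue, visited = deque([seed]), set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        for link in fetch_links(url):
            absolute = urljoin(url, link)  # resolve relative links
            if urlparse(absolute).scheme in ("http", "https") and absolute not in visited:
                queue.append(absolute)
    return visited

# Hypothetical link graph instead of real HTTP fetches
graph = {
    "http://a.test/": ["/b", "/c"],
    "http://a.test/b": ["/c"],
    "http://a.test/c": [],
}
print(sorted(crawl("http://a.test/", lambda u: graph.get(u, []))))
```

In the real projects the queue and visited store live in external services so many workers can pull URLs concurrently; the control flow stays the same.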
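
The Projeto_ETL_RFB_IBGE_ANP entry describes a Python/PostgreSQL ETL over public datasets. A minimal sketch of that extract-transform-load shape, with `sqlite3` standing in for PostgreSQL and hypothetical hand-written rows standing in for the downloaded RFB/IBGE/ANP files:

```python
import sqlite3

def run_etl(rows, conn):
    """Load extracted records into a table after normalizing them."""
    conn.execute("CREATE TABLE IF NOT EXISTS companies (cnpj TEXT PRIMARY KEY, name TEXT)")
    # transform: strip punctuation from the CNPJ key, normalize the name
    cleaned = [
        (r["cnpj"].replace(".", "").replace("/", "").replace("-", ""),
         r["name"].strip().upper())
        for r in rows
    ]
    # load: upsert so re-running the pipeline is idempotent
    conn.executemany("INSERT OR REPLACE INTO companies VALUES (?, ?)", cleaned)
    return conn.execute("SELECT COUNT(*) FROM companies").fetchone()[0]

conn = sqlite3.connect(":memory:")
extracted = [{"cnpj": "12.345.678/0001-90", "name": " acme ltda "}]
print(run_etl(extracted, conn))  # 1
```

The actual project downloads multi-gigabyte CSV dumps and loads them into PostgreSQL, but the extract/transform/load split is the same.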
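
The fooock/robots.txt entry offers robots.txt rule checking as a service (in Java). The same check is available locally via Python's standard-library `urllib.robotparser`; the rules below are a made-up example, parsed from a string rather than fetched:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body; a real service would fetch this per host
rules = """User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
print(rp.can_fetch("*", "https://example.com/public"))        # True
```

A polite crawler like those listed above runs this check before enqueueing any URL.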