Apache Airflow Integration for Baselinker & Discogs (Software Development Needed)
Location: Sutton, United Kingdom
Budget: Recommended by industry experts
Time to start: As soon as possible
Project description:
"Title: Baselinker & Discogs Integration with Apache Airflow
Project Description
We are looking for a skilled freelance developer to build a custom data pipeline using Apache Airflow. The goal is to create a robust and automated integration that synchronizes orders, products, and inventory between Baselinker and Discogs. This is a critical project to streamline our e-commerce operations.
The ideal candidate will have strong experience with API integration, as well as a deep understanding of Apache Airflow's core functionalities.
Important Note: As part of this project, the integration with the [login to view URL] system must be done through a special data exchange file. A file example and documentation will be provided.
Key Responsibilities
Airflow DAG Development: Design, develop, and test a modular and scalable Apache Airflow DAG (Directed Acyclic Graph) for the Baselinker-Discogs integration.
API Integration: Create custom operators or Python scripts to interact with both the Baselinker and Discogs APIs for fetching and updating data.
Order Synchronization: Implement a process to regularly fetch new orders from Baselinker and push relevant information to Discogs.
Product & Inventory Management: Develop logic to synchronize product listings and update stock levels accurately between both platforms.
File-Based Integration: Create a process to generate a special data exchange file for [login to view URL], as required for that specific integration.
Error Handling: Implement robust error handling, logging, and retry mechanisms within the DAG to ensure data integrity and pipeline reliability.
Documentation: Provide detailed documentation for the Airflow DAG, including setup instructions, configuration requirements, and an overview of the data flow.
Required Skills & Experience
Apache Airflow: Proven expertise in designing, building, and deploying data pipelines with Apache Airflow.
Python: Excellent Python programming skills.
API Integration: Strong experience working with RESTful APIs.
Data Pipelines: Understanding of data pipeline best practices, including idempotency, monitoring, and scheduling.
File Handling: Experience with creating and manipulating structured data files (e.g., CSV, JSON, XML).
Git/GitHub: Essential experience with Git for version control, including a good understanding of branching, merging, and collaboration on GitHub.
Deliverables
A fully functional and well-documented Apache Airflow DAG in a Python file.
All necessary custom scripts and operators.
A [login to view URL] file listing all Python dependencies.
A clear README file with setup and usage instructions.
Budget & Timeline
Budget: [Please provide your budget]
Timeline: [Please provide your desired timeline]
Please submit your proposal with a brief summary of your relevant experience and any past projects that demonstrate your ability to complete this task." (client-provided description)
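For bidders unfamiliar with the pattern described above, the order-synchronization and error-handling responsibilities boil down to task logic that would be wrapped in Airflow tasks (e.g. via `PythonOperator` or the TaskFlow API). The following is a minimal, hedged sketch of that logic only; it deliberately omits Airflow itself and the real Baselinker/Discogs API calls, and every function and field name here (`with_retries`, `sync_new_orders`, `order_id`) is a hypothetical placeholder, not part of either platform's API:

```python
import time
from typing import Any, Callable, Iterable, List, Set, TypeVar

T = TypeVar("T")

def with_retries(fn: Callable[[], T], attempts: int = 3, backoff_s: float = 1.0) -> T:
    """Call fn, retrying on any exception with exponential backoff.

    Inside a real DAG you would usually prefer Airflow's built-in task
    retries; this shows the same idea at the function level.
    """
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise  # out of attempts: surface the error to the scheduler
            time.sleep(backoff_s * 2 ** (attempt - 1))
    raise RuntimeError("unreachable")

def sync_new_orders(
    baselinker_orders: Iterable[dict],
    already_synced_ids: Set[Any],
) -> List[dict]:
    """Idempotent selection: return only orders not yet pushed to Discogs.

    Tracking already-synced IDs (e.g. in a database or Airflow Variable)
    makes re-runs of the DAG safe, which is the idempotency requirement
    named in the job description.
    """
    return [o for o in baselinker_orders if o["order_id"] not in already_synced_ids]

# Hypothetical usage: two orders fetched from Baselinker, one already synced.
orders = [{"order_id": 101, "buyer": "a"}, {"order_id": 102, "buyer": "b"}]
pending = sync_new_orders(orders, already_synced_ids={101})
# pending now holds only order 102
```

In an actual deliverable, `with_retries` would wrap the HTTP calls to both APIs and `sync_new_orders` would run inside a scheduled task, with the synced-ID set persisted between runs.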
Matched companies (6)

Versasia Infosoft

Knowforth Tech

TechGigs LLP

SJ Solutions & Infotech

Kiantechwise Pvt. Ltd.
