
Database Engineer (Independent Contractor)

Job Number
J2435057
Location
Remote - Hungary; Remote - Poland; Remote - Ukraine
Category
Information Technology
Posted Date
01/26/2026
Job Type
Full time

At Jabil (NYSE: JBL), we are proud to be a trusted partner to the world's top brands, delivering comprehensive engineering, manufacturing, and supply chain solutions. With 60 years of experience across industries and more than 100 sites worldwide, Jabil combines global reach with local expertise to provide scalable and customized solutions. Our commitment extends beyond business success: we strive to build sustainable processes that minimize environmental impact and foster prosperity and diversity in communities around the globe.

We’re hiring a Database Engineer to design, build, and operate reliable data platforms and pipelines. You’ll focus on robust ETL/ELT workflows, scalable big data processing, and cloud-first architectures (Azure preferred) that power analytics and applications.

What You’ll Do

  • Design, build, and maintain ETL/ELT pipelines and data workflows (e.g., Azure Data Factory, Databricks, Spark, ClickHouse, Airflow).
  • Develop and optimize data models and data warehouse/lake/lakehouse schemas (partitioning, indexing, clustering, cost/performance tuning).
  • Build scalable batch and streaming processing jobs (Spark/Databricks, Delta Lake; Kafka/Event Hubs a plus).
  • Ensure data quality, reliability, and observability (tests, monitoring, alerting, SLAs).
  • Implement CI/CD and version control for data assets and pipelines.
  • Secure data and environments (IAM/Entra ID, Key Vault, strong tenancy guarantees, encryption, least privilege).
  • Collaborate with application, analytics, and platform teams to deliver trustworthy, consumable datasets.

Required Qualifications

  • ETL or ELT experience required (ADF/Databricks/dbt/Airflow or similar).
  • Big data experience required.
  • Cloud experience required; Azure preferred (Synapse, Data Factory, Databricks, Azure Storage, Event Hubs, etc.).
  • Strong SQL and performance tuning expertise; hands-on with at least one warehouse/lakehouse (Synapse/Snowflake/BigQuery/Redshift or similar).
  • Solid data modeling fundamentals (star/snowflake schemas, normalization/denormalization, CDC, etc.).
  • Experience with CI/CD, Git, and infrastructure automation basics.
  • Willingness to work as an Independent Contractor.

Nice to Have

  • Streaming pipelines (Kafka, Event Hubs, Kinesis, Pub/Sub) and exactly-once/at-least-once patterns.
  • Orchestration and workflow tools (Airflow, Prefect, Azure Data Factory).
  • Python for data engineering.
  • Data governance, lineage, and security best practices.
  • Infrastructure as Code (Terraform) for data platform provisioning.

Jabil, including its subsidiaries, is an equal opportunity employer and considers qualified applicants for employment without regard to race, color, religion, national origin, sex, age, disability, genetic information, veteran status, or any other characteristic protected by law.

BE AWARE OF FRAUD: When applying for a job at Jabil, you will be contacted through our official job portal with a jabil.com e-mail address, by a direct phone call from a member of the Jabil team, or by a direct e-mail from a jabil.com e-mail address. Jabil does not request payments for interviews or at any other point during the hiring process. Jabil will not ask for personal identifying information such as a social security number, birth certificate, financial institution details, driver’s license number, or passport information over the phone or via e-mail. If you believe you are a victim of identity theft, contact your local police department. Any scam job listings should be reported to the website on which they were posted.

Accommodation Statement

If you are a qualified individual with a disability, you have the right to request a reasonable accommodation if you are unable or limited in your ability to use or access Jabil.com/Careers site as a result of your disability. You can request a reasonable accommodation by sending an e-mail to Always_Accessible@Jabil.com with the nature of your request and contact information. Please do not direct any other general employment related questions to this e-mail. Please note that only those inquiries concerning a request for reasonable accommodation will be responded to.
