What we expect from you: a university degree in biotechnology, computer science, mathematics, or a related field; knowledge of and experience in Python programming (particularly in scientific projects); familiarity with the relevant structures and…
2025-01-24 10:25:14 Graylight Imaging Sp. z o.o. What You’ll Need: 4+ years of experience in data engineering. Proficiency with AWS services like Glue, Redshift, and Athena. Experience with data storage solutions (S3, DynamoDB, or RDS). Familiarity with orchestration tools such as Airflow. Nic…
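The stack in the posting above (Athena over S3, used alongside Glue and Airflow) can be illustrated with a short Python sketch; the region, database, bucket, and table names below are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch of querying S3-backed data with Athena via boto3.
# All names (region, database, table, result bucket) are illustrative.
import time
import boto3

athena = boto3.client("athena", region_name="eu-central-1")

def run_athena_query(sql: str, database: str, output_s3: str) -> str:
    """Submit a query, poll until it finishes, and return the execution id."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    query_id = execution["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {query_id} ended in state {state}")
    return query_id

run_athena_query(
    "SELECT event_date, COUNT(*) FROM events GROUP BY event_date",
    database="analytics",
    output_s3="s3://example-athena-results/",
)
```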
2025-01-23 10:25:15 KraftCode Your responsibilities: Develop innovative solutions for complex data challenges, using creativity and technology. Design and build data systems that fulfill client requirements and optimize user experience. Collaborate across teams to integrate…
2025-01-21 11:40:47 Your responsibilities: Collaborate with Data Scientists, product managers, and other engineers to refine and improve data processing systems; design and maintain scalable, optimized data pipelines in order to…
2025-01-21 11:40:46 Your responsibilities: Design and implement controlling systems (Data Warehouses, Business Intelligence, Enterprise Performance Management); take part in implementation projects delivered for the company’s clients; design and…
2025-01-21 10:40:45 At least 4 years of experience programming in Python; practical knowledge of Python design patterns; ability to write unit, integration, and end-to-end tests following good practices, including TDD; experience in designing…
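A minimal sketch of the test-first (TDD) style that posting refers to, using pytest; the clean_records function and its field layout are invented purely for illustration.

```python
# TDD-style sketch: tests written against a small, pure transformation.
# The function name and record fields are hypothetical.
import pytest

def clean_records(records):
    """Drop rows without an id and normalise names to lowercase."""
    return [
        {**r, "name": r["name"].strip().lower()}
        for r in records
        if r.get("id") is not None
    ]

def test_drops_rows_without_id():
    assert clean_records([{"id": None, "name": "X"}]) == []

def test_normalises_names():
    out = clean_records([{"id": 1, "name": "  Ada  "}])
    assert out == [{"id": 1, "name": "ada"}]

@pytest.mark.parametrize("records", [[], [{"id": 2, "name": "ok"}]])
def test_preserves_row_count_when_ids_present(records):
    assert len(clean_records(records)) == len(records)
```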
2025-01-21 10:25:14 Respect Energy Holding S.A. 3,7 Technical requirements: 5+ years in a senior developer role, with hands-on experience in building data processing pipelines. Proficiency with GCP services, especially BigQuery and BigQuery SQL, for large-scale data processing and optimization. Ext…
2025-01-21 10:25:14 emagine sp. z o.o. 3,9 Your responsibilities: We are currently looking for a Data Engineer who likes the sound of all that! In this role, you would be part of the Data & Analytics team and would evaluate requirements, design, code, and maintain high-performance data…
2025-01-21 09:40:37 Your responsibilities: Develop the data warehouse in GCP in line with good practices and current standards; build and manage very high-volume data flows; maintain high code quality by writing tests…
2025-01-21 07:56:05 Silky Coders 5,0
2025-01-20 17:56:05 LoopMe
2025-01-20 16:56:04 Michael Page International (Poland) Sp...
2025-01-20 15:56:05 Kevin Edward Your responsibilities: Perform data and application migrations from Microsoft Fabric to Databricks, ensuring minimal disruption and optimal performance. Provide technical expertise and experience in Databricks, including data engineering, data…
2025-01-20 15:40:39 Your responsibilities: Designing new solutions and proposing improvements to existing solutions within data platforms, both for requests coming from the business (functional changes) and from technology (architectural…
2025-01-20 14:56:04 SoftwareOne
2025-01-20 14:56:04 The Codest
2025-01-20 14:56:04 JIT Team Your responsibilities: Translate business requirements into actionable data structures and technical specifications. You will collaborate in requirements gathering, data architecture, and information architecture. Design, create, and maintain…
2025-01-20 14:40:46 Your responsibilities: Designing, developing, and maintaining robust, scalable data pipelines; collaborating closely with product managers, UX designers, data analysts, and software engineers to understand their requirements and deliver high…
2025-01-20 10:56:03 Experis Manpower Group Your skills: At least 3-4 years of experience in programming, including at least 2 years of experience in Spark with Scala. Experience working with Big Data: Spark, Hadoop, Hive. Experience with different database structures (Postgres, SQL, Hive…
2025-01-20 10:25:18 GFT Poland 3,1
2025-01-18 13:56:04 Espeo Software A bachelor’s degree in informatics, information systems, or a related discipline in IT or data science. At least 4 years of relevant professional experience in analytics, resource/performance management, or data science. Comprehensive expertise…
2025-01-18 10:25:16 Link Group dbt certified. Skilled in data processing. Experience with data modeling and designing conceptual, logical, and physical data models. Designs EL(T) data pipelines using SQL. Familiar with traditional DW relational concepts (Dimensions, Facts, star…
2025-01-18 10:25:16 7N Sp. z o.o. 5,0 Must-haves: Demonstrated experience (min. 4 years) in Data Engineering. Expertise with Azure and Databricks Data Platform technologies, including Azure Data Lake Storage (ADLS Gen2), Azure SQL (PaaS), Azure Data Factory (ADF), Azure Databricks…
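A minimal PySpark sketch of the ADLS Gen2 to Databricks pattern such postings describe; the storage account, container, paths, and table names are placeholders, and the cluster is assumed to already have access to the storage account configured.

```python
# Read raw JSON from ADLS Gen2, aggregate, and write a Delta table.
# Storage account, container, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided by Databricks at runtime

raw_path = "abfss://raw@examplestorageacct.dfs.core.windows.net/sales/2025/"

orders = (
    spark.read.format("json")
    .load(raw_path)
    .withColumn("order_date", F.to_date("order_ts"))
)

daily_totals = (
    orders.groupBy("order_date", "country")
    .agg(F.sum("amount").alias("total_amount"))
)

# Delta Lake is the default table format on Databricks; the target schema
# ("analytics") is assumed to exist.
daily_totals.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_sales")
```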
2025-01-18 10:25:16 Devire Your responsibilities: Design, develop, and maintain robust data pipelines using Python, Spark, Hadoop, and SQL for batch and streaming data processing; collaborate with cross-functional teams to understand data requirements and design efficient…
2025-01-18 00:40:05 Your responsibilities: As a member of agile project teams, your mission will be to build solutions and infrastructure aimed at solving our clients’ business problems. Design, build, maintain, and troubleshoot data pipelines and processing…
2025-01-17 15:56:06 Link Group
2025-01-17 13:56:04 ITFS 4,8
2025-01-17 12:56:05 CLOUDFIDE
2025-01-17 12:56:05 SoftKraft Your responsibilities: Develop a comprehensive technical migration plan involving data ingestion, transformation, storage, and access control using Azure Data Factory and data lake solutions. Build scalable and reusable frameworks for efficient…
2025-01-17 10:56:06 7N 5,0
2025-01-17 10:56:06 Speechify Inc
2025-01-17 10:56:06 ALIOR BANK S.A. Requirements: At least 4 years of experience as a Data Engineer / SQL Developer. Knowledge of Python and SQL, including T-SQL: writing complex stored procedures, performance optimization. Ability to build ETL processes (…
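A hedged sketch of driving a T-SQL stored procedure from Python with pyodbc, the kind of ETL step the posting above implies; the server, database, and procedure name (etl.load_daily_positions) are hypothetical, and the procedure itself would live in SQL Server.

```python
# Call a parameterised T-SQL stored procedure from Python via pyodbc.
# Connection details and the procedure name are illustrative placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql.example.local;DATABASE=dwh;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Pass the business date as a parameter; the procedure does the heavy lifting.
cursor.execute("EXEC etl.load_daily_positions @business_date = ?", "2025-01-17")

conn.commit()
conn.close()
```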
2025-01-17 10:56:06 XTB
2025-01-17 10:56:06 Xebia Your responsibilities: Create and maintain data models for importing data into the accounting system and for operational reporting in the Accounting and Risk area; integrate data to produce the required data models; implement…
2025-01-17 09:40:43 Data Engineer with Databricks and PySpark - 100% remote. Your responsibilities will include: - Leading the migration of data and applications from Microsoft Fabric to Databricks, ensuring minimal disruption…
2025-01-16 16:30:07 Your responsibilities: Develop a technical data migration plan covering data ingestion, transformation, storage, and access management (Azure Data Factory, data lake); build scalable and reusable…
2025-01-16 13:40:34 Hays IT Contracting is a cooperation based on B2B rules. We connect IT specialists with the most interesting technology projects on the market. Join the group of 500 satisfied Contractors working for Hays’ clients! For our Client we are currently…
2025-01-16 11:53:48 Minimum 5 years in data engineering with demonstrated expertise in designing, building, and optimizing data workflows. Core Languages: Proficient in SQL, PySpark, and Python. Frameworks and Tools: Skilled in using Apache Airflow, AWS Glue, Kafka…
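A minimal Airflow 2.x DAG sketch of the daily orchestration such requirements imply; the dag_id, the task callables, and what each task would actually launch (a PySpark or Glue job, a Kafka consumer) are illustrative assumptions only.

```python
# Two-step daily pipeline: extract then transform, wired as an Airflow DAG.
# All names are placeholders; the callables only print for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # In a real pipeline this might pull a day's events from Kafka or S3.
    print("extracting partition", context["ds"])

def transform(**context):
    # ...and this might launch a PySpark or AWS Glue job for that partition.
    print("transforming partition", context["ds"])

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```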
2025-01-16 10:25:17 Devire Data Engineer, Poland - Hays Poland job offer.
2025-01-16 06:31:02 Hays Poland Data Engineer, Poland. Ref. no.: 1190922. Hays IT Contracting is a cooperation based on B2B rules. We connect IT specialists with the most interesting technology projects on the market. Join the group of 500 satisfied Contractors working for Hays’…
2025-01-15 18:56:07 N-iX 2,7
2025-01-15 18:56:07 GlobalLogic 4,5 The DataOS team builds and supports a platform for processing and organizing data at AppsFlyer. We are creating a data lake that stores over 10 PB of data and ingests more than 100 TB daily. We are looking for passionate data engineers who thrive…
2025-01-22 10:25:14 AllSTARSIT What's important for us? At least 7+ years of professional experience in a data-related role. Experience with infrastructure-as-code tools like Terraform. Proficiency in using Airflow. Expertise in the AWS stack and services (EC2, ELB, IAM, RDS, Route53…
2025-01-24 10:25:14 Sunscrapers