Experience Level: Entry Level
About the job
Join our dynamic team as a Danish-Speaking Content Moderator focused on an innovative social media AI platform in Athens, Greece. In this role, you will be instrumental in ensuring that content aligns with community standards and guidelines, maintaining a safe and engaging environment for users.
Your responsibilities will include reviewing, moderating, and curating user-generated content, utilizing advanced AI tools, and collaborating with cross-functional teams to enhance user experience.
D ONE is a leading consultancy firm specializing in data, artificial intelligence, and transformative solutions. We empower organizations to convert their aspirations into tangible results by aligning their strategies, establishing robust data infrastructures, delivering impactful solutions at scale, and ensuring governance and trust through a people-focused approach.

With offices in Zurich, Munich, and Athens, we provide unparalleled support to clients across Europe, leveraging our extensive expertise throughout the data value chain. Our methodology is practical, centered around people, and designed for sustainable value creation.

In your role as a DataOps Engineer in Athens, you will collaborate with client project teams to design, develop, and deploy high-performance cloud and data applications. Your work will be pivotal in constructing the software and data foundations necessary for data-driven value creation and the industrialization of AI across diverse industries.

Key Responsibilities:
- Design and implement software and data applications with a strong emphasis on automation and scalability.
- Architect cloud infrastructure using Infrastructure as Code (Terraform, Terragrunt) within the Azure ecosystem.
- Build resilient data pipelines and lakehouse architectures utilizing Databricks, PySpark, and Unity Catalog.
- Establish CI/CD pipelines in Azure DevOps for efficient testing and delivery of both infrastructure and data workloads.
- Occasional travel to Switzerland for team collaborations at our headquarters in Zurich.
About Satori Analytics
Satori Analytics is based in Athens and delivers Data and AI solutions to global brands. The team works across the full data lifecycle, from ingestion to AI applications, and builds technologies for industries such as fintech, airlines, FMCG, retail, manufacturing, and financial services. With more than 100 technology professionals, Satori Analytics is driving data innovation in South-Eastern Europe and beyond.

Role Overview: Junior AI Engineer
This role focuses on designing and implementing AI workflows and models, with a strong emphasis on Python and large language models (LLMs). The Junior AI Engineer works closely with data scientists, developers, and domain experts to deliver scalable AI solutions and translate business needs into actionable outcomes.

What You Will Do
- Design and implement AI workflows and models: Use Python and LLMs to build scalable solutions that connect AI/ML with software engineering.
- Collaborate with multidisciplinary teams: Work alongside data scientists, developers, and subject matter experts to move projects forward and share knowledge.
- Research and apply new techniques: Stay current on AI trends and incorporate emerging methods into your work.
- Translate business problems: Turn complex business challenges into practical ML/AI solutions, and communicate results clearly.
- Document your work: Keep thorough, clear records so projects remain reproducible and easy for teammates to understand.

Location
Athens, Attica, Greece
Join Our Innovative Team at Kaizen Gaming

Kaizen Gaming, the dynamic force behind Betano, stands as one of the premier GameTech companies globally, serving 19 diverse markets. Our mission is to harness advanced technology to deliver unparalleled entertainment experiences to millions of satisfied customers.

Our vibrant workforce consists of over 2,700 talented individuals from more than 40 nationalities spanning three continents. We take pride in being recognized among the Best Workplaces in Europe and are certified as a Great Place to Work in all our offices. At Kaizen Gaming, every day is a new opportunity to excel. Are you ready to Press Play on your career potential?

About the Role

As a Lead Data Scientist, you will spearhead our AI initiatives by analyzing complex datasets and developing machine learning models that drive our innovative AI products. The ideal candidate will possess in-depth knowledge of machine learning algorithms and a proven track record of deploying ML/AI applications in production environments.

Key Responsibilities:
- Convert product specifications into actionable machine learning tasks and pinpoint high-impact AI opportunities.
- Conduct comprehensive data analysis to unearth vital patterns and derive actionable insights.
- Execute exploratory data analysis (EDA) and feature engineering to facilitate the modeling process.
- Implement best practices in model selection, parameter tuning, and validation.
- Conduct comparative experiments to enhance model training.
- Analyze machine learning metrics to assess various solution options.
- Oversee the complete lifecycle of AI features, from data collection through model design to implementation and optimization in live environments.
- Mentor junior team members, sharing expertise and leading intricate projects.
Working Model: Hybrid | Type: Full-time

Accepted Ltd. is a leading provider of software and digital transformation services, dedicated to accelerating innovation across various sectors including Finance, Energy, Gaming, and Telecommunications. With over 20 years of engineering excellence, we pride ourselves on creating outcome-focused solutions and assembling high-performing teams that integrate seamlessly with our clients' operations.

We are seeking a Senior GRC & Privacy Engineer to bolster our hybrid delivery teams.

Key Responsibilities
- Engage in workshops and client meetings to identify needs related to Privacy, Data Governance, GRC, Security Assurance, and Ethics & Compliance.
- Analyze client business processes and develop customized governance and compliance solutions.
- Design and configure workflows within GRC/Privacy platforms, encompassing risk assessments, privacy operations, compliance controls, vendor risk, and ethics workflows.
- Conduct maturity assessments and provide actionable recommendations for privacy, GRC, and security compliance.
- Assist in pre-sales activities by preparing materials, conducting demos, and estimating project scope and effort.
- Create methodologies, templates, processes, and best practices for the service line.
- Help formulate policies, procedures, compliance frameworks, and risk registers.
- Mentor junior team members and uphold the quality of deliverables.
Join Kpler, where we simplify the complexities of global trade, providing insights that empower organizations in the commodities, energy, and maritime sectors. Since our inception in 2014, we've been committed to delivering premier intelligence through intuitive platforms. With a diverse team of over 700 experts from more than 35 countries, we transform intricate data into actionable strategies to help our clients thrive in a dynamic market landscape.

We are on the lookout for a passionate and skilled Senior Data Analyst to join our Business Intelligence & Insights team. In this pivotal role, you will contribute to the data foundation that drives Kpler's commercial and strategic decision-making. Reporting to the Head of BI, you'll manage essential data pipelines, design scalable Looker solutions, and serve as a trusted advisor to stakeholders across the organization.

This high-impact position is ideal for individuals who excel at the crossroads of data engineering and business analytics. One day, you'll be constructing robust, production-ready infrastructure, and the next, you'll be translating complex datasets into actionable insights for a commercial audience. If you're excited about working with real-time commodity flow data and shaping the BI strategy of a rapidly growing B2B SaaS company, we want to hear from you.
Are you a driven Data Analyst with expertise in Power BI? Join our dynamic team to play a pivotal role in advancing e-Government initiatives! We are in search of a proficient Data Analyst - Power BI Expert to spearhead the design, analysis, and execution of data models, reports, and dashboards.

Your Responsibilities:
- Engage with stakeholders to comprehend their business requirements and translate them into effective data-driven solutions.
- Craft and implement Power BI reports and dashboards that yield vital business insights.
- Develop and refine data models within Power BI to guarantee seamless data integration and optimize reporting performance.
- Create and manage Power BI Dataflows, ensuring efficient connection and transformation of data from diverse sources for reporting purposes.
- Write optimized SQL queries to extract, cleanse, and transform data from various databases like SQL Server and PostgreSQL, preparing datasets for analysis and reporting.
- Utilize DAX (Data Analysis Expressions) within Power BI to generate measures and calculated columns for comprehensive analysis.
- Publish and manage Power BI reports and dashboards in the Power BI Service, ensuring accessibility and usability for stakeholders.
- Oversee the organization of reports within Power BI workspaces, managing permissions and access rights for users.
- Troubleshoot and enhance Power BI reports to ensure accuracy, reliability, and optimal performance.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Science, Software Engineering, Electrical Engineering, or a related field.
- A minimum of 2 years' experience in business intelligence or data analysis with a strong focus on Power BI.
- Proficiency in DAX for calculations and data modeling.
- Experience with Power Query for data transformation and cleansing.
- Proficiency in publishing, sharing, and managing reports in the Power BI Service, including workspace administration.
- Solid understanding of data visualization principles and best practices.
- Knowledge of relational databases and SQL for data retrieval.
- Exceptional analytical, problem-solving, and communication skills.
- Fluency in both spoken and written English.

Preferred Qualifications:
- Experience integrating Power BI with other Microsoft tools (e.g., Excel, Azure) and third-party applications.
- Ability to leverage Power BI's AI capabilities for advanced analytics solutions, including machine learning models and automated insights.
- Familiarity with popular machine learning frameworks and libraries (e.g., TensorFlow, Scikit-learn, PyTorch) for building and deploying models.
- Proficiency in data preprocessing and cleansing for machine learning, including handling missing values, data normalization, and feature engineering.
- Understanding of data warehousing concepts.
Join Our Team as a Field Data Collection Specialist / Surveyor!

At Terry Soot Management Group (TSMG), we specialize in field data collection where automation is not feasible. Founded in 2017, we operate across Europe and North America, providing essential data that helps clients make informed decisions. Our dedicated teams use advanced techniques to capture features, images, videos, audio, and detailed scans of various locations.

Project Overview:
We are embarking on a project aimed at enhancing online mapping applications through the collection of visual and geo data.

Role Responsibilities:
As a Field Data Collection Specialist, you will be equipped with a specialized camera backpack and tasked with traversing designated areas to capture high-quality footage of the surroundings. Your role is crucial in enabling improvements in mapping accuracy and detail.

Work Schedule:
Workdays are Monday through Friday, with daily shifts ranging from 6 to 8 hours.
Join RWS Group as an AI Data Specialist, where you will play a pivotal role in supporting AI and machine learning projects by managing and analyzing linguistic data in Greek. This is an exciting opportunity to contribute to innovative technologies and enhance AI processes.
Optasia is a B2B2X financial technology platform focused on scoring, financial decision-making, disbursement, and collection. The company's mission centers on advancing financial inclusion worldwide and reshaping the financial sector through a distinct approach.

Role Overview
The Quantitative Risk Data Scientist will join the Credit Portfolio Optimization team in Athens. This position sits at the intersection of risk management, research, and technology, contributing to algorithmic trading and portfolio optimization projects. The role involves working alongside traders, big data experts, and machine learning engineers to support real-time decision-making and maintain strong system performance.

What You Will Do
- Design and implement algorithmic solutions to maximize revenue through detailed credit risk analysis.
- Extract actionable insights on credit risk using advanced big data analytics.
- Develop tools and procedures for portfolio risk assessment to monitor and manage credit risk effectively.
- Conduct thorough risk analyses of microloans and other financial products, refining risk models to improve decision quality.
- Identify and evaluate credit risk factors by applying advanced computational techniques to large datasets.
- Build predictive models using statistical and machine learning methods to strengthen risk management strategies.
- Collaborate closely with data scientists and machine learning engineers.
- Continuously update risk assessment methods in response to changing market conditions.

What We Look For
- Bachelor's or Master's degree in Data Science, Statistics, Finance, Mathematics, or a related discipline.
- 2-5 years of experience in quantitative risk analysis, preferably within financial services.
- Demonstrated skill in developing and applying algorithmic models for revenue and risk optimization.
- Strong background in statistical modeling and experience with machine learning models.
- Proficiency in Python or R, and hands-on experience with big data risk analytics.
- Track record of designing and deploying portfolio risk assessment tools.
- Excellent problem-solving skills and attention to detail when working with complex datasets.

Key Attributes
- Sound judgment and strong analytical thinking.
- Self-driven, resourceful, and comfortable working independently.
Satori Analytics develops data and AI solutions for clients in industries such as FMCG, retail, manufacturing, and financial services. With a team of over 100 technology professionals based in Athens, the company manages projects that span the entire data lifecycle, from ingestion through to deploying AI applications. Projects range from building cloud infrastructure for fintech clients to delivering predictive analytics for airlines.

Role Overview
The AI Engineer designs and builds AI workflows and models, focusing on Python and large language models (LLMs). This role involves close collaboration with data scientists, developers, and domain experts to turn business needs into practical AI and machine learning solutions. The position combines research, hands-on engineering, and teamwork within a growing group.

What You Will Do
- Design and implement AI models and workflows using Python, with a particular focus on large language models (LLMs).
- Collaborate with data scientists, software engineers, and subject matter experts to exchange ideas, review progress, and solve challenges together.
- Research new trends in AI and machine learning, applying relevant techniques to current projects.
- Translate business challenges into actionable machine learning or AI solutions, and communicate results clearly to stakeholders.
- Document code and processes to support clarity and reproducibility for the team.

Location
This position is based in Athens, Attica, Greece.
Join our team as a Senior Backend Engineer (Python) and engage in a challenging project that focuses on data-intensive processing, system integration, and semantic technologies. In this pivotal role, you will take charge of an established codebase, enhancing its stability and evolving it to meet the project's demands. This position is perfect for seasoned engineers who thrive on working with complex systems and have a solid grasp of data-driven architectures.

Key Responsibilities:
- Thoroughly analyze and take ownership of an existing Python-based backend.
- Refactor and optimize backend components with an emphasis on maintainability and performance.
- Develop and sustain data processing pipelines and integration workflows.
- Collaborate closely with data and semantic engineers on RDF/SPARQL-driven processes.
- Contribute to the architectural redesign and technical documentation.
- Assist with deployment, configuration, and troubleshooting tasks.
- Ensure high code quality through rigorous reviews, testing, and adherence to best practices.

Requirements:
- A minimum of 3 years of professional experience in backend development.
- Expertise in Python programming.
- Experience with:
  - Apache Airflow and AWS.
  - Data-intensive or integration-heavy systems.
  - APIs, batch processing, and backend services.
  - Configuration-driven systems (XML / JSON / YAML).
- Strong understanding of:
  - Software architecture and design patterns.
  - Debugging and maintaining legacy codebases.
- Proven experience in complex, multi-stakeholder projects (experience in EU or public-sector projects is a plus).

Preferred Qualifications:
- Familiarity with Semantic Web technologies (RDF, SPARQL, OWL).
- Experience in data modeling or knowledge-based systems.
- Exposure to DevOps practices (CI/CD, containerization).
- Experience in contributing to or maintaining technical documentation (e.g., AsciiDoc, Antora).

Benefits:
We value talent and commitment, offering the following perks:
- Attractive full-time salary.
- Private health insurance under the company's group plan.
- Flexible working hours.
- Access to top-quality tools.
- Opportunities for professional development, including language courses and specialized training.
- Career advancement potential by collaborating with leading specialists in the field.
- A dynamic work environment that encourages personal and professional growth.

If you're ready for an exciting challenge, we would love to hear from you!
Optasia is an innovative B2B2X financial technology platform dedicated to enhancing scoring, financial decision-making, disbursement, and collection processes. Our mission is to foster financial inclusion for everyone. Join us in transforming the financial landscape!

We are on the lookout for passionate and driven professionals who thrive in a collaborative environment. As a member of our dynamic team, you will contribute to delivering innovative solutions that make a difference.

Data plays a pivotal role in Optasia's growth strategy, with our ML Engineering team being a key contributor. We harness data from various sources into our extensive big data clusters and develop and manage multiple analytical pipelines using state-of-the-art big data technologies.

As a Senior ML Infrastructure Engineer, you will play a crucial role in enhancing Optasia's data-driven decision-making and credit risk management by developing and optimizing scalable, end-to-end ML pipelines. Your key responsibilities will include: (i) building robust ML pipelines, (ii) designing statistical and machine learning algorithms, and (iii) operationalizing these solutions to bolster credit risk management, directly impacting Optasia's success.

Your Responsibilities:
- Provide technical guidance in ML engineering to ensure the adoption of optimal tools and methodologies, staying ahead of emerging trends and delivering industry-leading solutions.
- Enhance the scalability, stability, accuracy, speed, and efficiency of ML workflows while maintaining stringent testing and code quality standards.
- Contribute to the design and development of microservices and tools that facilitate the Machine Learning lifecycle at Optasia.
- Collaborate on the design and implementation of scalable, real-time microservices utilized globally.
- Foster continuous improvements in the development lifecycle with the team.
- Design, develop, and maintain large-scale Spark jobs using PySpark and Scala.
- Build and manage CI/CD pipelines with Jenkins.
- Create automation scripts using Python or Bash.
- Develop and deploy scalable Airflow pipelines to support the Machine Learning lifecycle.
- Conduct data exploration and analysis to scope, build, and refine Machine Learning proof-of-concepts (PoCs).
- Partner with Engineers and the Credit Risk team to design and implement solutions that deliver business value at Optasia.
- Optimize the codebase through Spark job tuning and refactoring.
- Drive enhancements to our feature engineering engine for improved efficiency.
Join Kaizen Gaming

At Kaizen Gaming, we are at the forefront of the GameTech industry, operating in 20 markets and delivering exceptional entertainment experiences to millions of customers. Our team of over 2,700 professionals hails from more than 40 nationalities across three continents, making us a truly diverse and innovative workplace.

We are proud to be recognized as one of the Best Workplaces in Europe and certified as a Great Place to Work in our offices. Here, every day is unique and filled with opportunities to grow. Are you ready to 'Press Play' on your potential?

About the Role

As the Data Science Team Lead, you will spearhead an AI product team dedicated to developing impactful AI solutions. Your role will involve analyzing data, designing machine learning models, and collaborating with technology teams to launch successful AI products. The ideal candidate will possess a robust understanding of machine learning algorithms, extensive experience in building production-ready ML/AI applications, and a proven track record of leading AI teams.

Your Responsibilities
- Lead and manage a team of 5 to 7 talented data scientists and machine learning engineers.
- Translate product requirements into machine learning challenges while identifying key areas for AI-driven business impact.
- Conduct data analysis to uncover significant patterns and insights.
- Perform exploratory data analysis (EDA) and feature engineering to enhance the modeling process.
- Implement best practices in model selection, optimization, and tuning.
- Execute comparative experiments for model training.
- Evaluate various potential solutions through ML metrics analysis.
- Oversee the complete lifecycle of AI features, from data collection to design, implementation, and optimization in production environments.
- Mentor junior team members and guide the team through complex projects.
Join Our Team at Kaizen Gaming!

Kaizen Gaming, the driving force behind Betano, is a leading GameTech company with operations in 20 global markets. We are dedicated to harnessing the latest technology to deliver exceptional entertainment experiences to millions of our valued customers.

Our diverse workforce consists of over 2,700 Kaizeners from more than 40 nationalities across three continents. We take pride in being recognized as one of the Best Workplaces in Europe and as a certified Great Place to Work. Here, each day offers unique challenges and opportunities. Are you ready to unleash your potential?

About the Role

As a Senior AI Data Governance Analyst, you will play an essential role within our Data/AI Governance team, contributing to the design and execution of a comprehensive AI governance framework across the organization. Collaborating with both business and technical stakeholders, you will ensure the integrity, consistency, availability, and compliance of AI models and applications, which is vital in the highly regulated betting and gaming industry.

Key Responsibilities:
- Develop and enforce AI governance policies, standards, and procedures.
- Facilitate the implementation of AI metadata management, data lineage, and quality frameworks.
- Profile datasets to evaluate their accuracy, completeness, consistency, and integrity for effective AI utilization.
- Identify and document AI quality issues, and recommend corrective measures.
- Monitor and analyze AI quality KPIs, supporting remediation initiatives.
- Maintain AI quality dashboards and reports to communicate trends to stakeholders.
- Engage in cross-functional meetings to align AI governance priorities.
- Enhance AI business glossaries, data dictionaries, and catalogs.
- Collaborate with AI/Data owners, stewards, and custodians to promote governance best practices.
- Assist with compliance-related AI & Data initiatives (e.g., EU AI Act, GDPR, AML, Responsible Gaming reporting).
- Evaluate and support AI/Data governance tools (e.g., Atlan, Collibra, Informatica, Microsoft Purview).
- Contribute to the continuous evolution of the AI governance framework and best practices.
Join Bioiatriki as a Data Entry Clerk, where you will play a crucial role in ensuring the accuracy and efficiency of our data management systems. This position requires a keen eye for detail and the ability to manage large volumes of information with precision.

Your responsibilities will include entering and updating data in our systems, verifying data accuracy, and assisting with administrative tasks as needed. If you have strong organizational skills and are comfortable working with data, we want to hear from you!
About Us:
At tbi bank, we are a dynamic challenger bank situated in Southeast Europe, recognized as a regional leader in alternative payment solutions. Our innovative ecosystem merges financing and shopping to meet the diverse needs of our customers. With a successful and profitable business model, we are proud to serve clients in Romania, Germany, Bulgaria, and Lithuania.

In our quest to expand our global presence, we have established a significant footprint in Greece, collaborating with thousands of merchants and consumers. Are you ready to contribute to our unique success story?

Your Role:
We are seeking a Senior Data Analyst to join our dedicated team in Greece!

Key Responsibilities:
- Conduct data discovery to identify insights.
- Collaborate with partners to explore potential data sources and develop data interpretation logic.
- Acquire data from both primary and secondary sources.
- Analyze and interpret data, providing regular and ad-hoc reports and insights.
- Engage in the continuous development of our Data Warehouse (DWH) facilities.
- Work with extensive datasets from multiple sources to create cohesive data reports.
- Utilize data manipulation and transformation tools effectively.
Role Overview
d-one seeks a Senior Databricks Engineer in Athens, Greece. The focus is on developing and enhancing data solutions that help meet business objectives. Candidates should bring strong hands-on experience with Databricks and a solid track record in designing, implementing, and improving analytics infrastructure.

What You Will Do
- Design and build data pipelines and solutions with Databricks.
- Collaborate with cross-functional teams to gather requirements and turn them into technical deliverables.
- Develop and maintain ETL processes.
- Optimize data workflows for better performance and scalability.
- Contribute to the ongoing development of the company's data strategy.

Location
This role is based in Athens, Attica, Greece.
Join Netcompany as a Lead Business Intelligence (BI) Engineer and take charge of designing and implementing robust BI solutions. You will lead a team of BI developers and analysts, ensuring high-quality data analysis that drives decision-making across the organization. This role requires a blend of technical expertise and leadership skills to manage BI projects effectively.
Elevate Your Career with Bally's Intralot as a Business Intelligence & Reporting Engineer!

We invite you to become part of our dynamic team at Bally's Intralot, where you will play a pivotal role in revolutionizing the gaming industry. As a Business Intelligence & Reporting Engineer, your expertise will directly influence the development and implementation of essential reporting solutions that enhance operational efficiency and empower data-driven decisions across our international teams.

Key Responsibilities:
- Create and maintain efficient data pipelines for seamless extraction, transformation, and loading of information from diverse data sources.
- Compile and manage extensive, complex datasets that fulfill both functional and non-functional business requirements.
- Identify, design, and implement improvements to internal processes, including automation of manual tasks, optimization of data delivery, and infrastructure redesign for enhanced scalability.
- Develop operational and non-operational reports utilizing SQL to meet evolving business demands.

Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related technical field.
- Minimum of 2 years of professional experience in designing and developing automated reporting solutions.
- Proficiency in relational databases such as MySQL and MS SQL Server.
- Familiarity with reporting design tools, including Jasper Reports, Power BI, and Tableau.
- Experience with Databricks is a plus.
- Excellent command of both written and verbal English and Greek.
Join our dynamic team as a Danish-Speaking Content Moderator focused on an innovative social media AI platform in Athens, Greece. In this role, you will be instrumental in ensuring that content aligns with community standards and guidelines, maintaining a safe and engaging environment for users.Your responsibilities will include reviewing, moderating, and cu…
D ONE is a leading consultancy firm specializing in data, artificial intelligence, and transformative solutions. We empower organizations to convert their aspirations into tangible results by aligning their strategies, establishing robust data infrastructures, delivering impactful solutions at scale, and ensuring governance and trust through a people-focused approach.With offices in Zurich, Munich, and Athens, we provide unparalleled support to clients across Europe, leveraging our extensive expertise throughout the data value chain. Our methodology is practical, centered around people, and designed for sustainable value creation.In your role as a DataOps Engineer in Athens, you will collaborate with client project teams to design, develop, and deploy high-performance cloud and data applications. Your work will be pivotal in constructing the software and data foundations necessary for data-driven value creation and the industrialization of AI across diverse industries.Key Responsibilities:Design and implement software and data applications with a strong emphasis on automation and scalability.Architect cloud infrastructure using Infrastructure as Code (Terraform, Terragrunt) within the Azure ecosystem.Build resilient data pipelines and lakehouse architectures utilizing Databricks, PySpark, and Unity Catalog.Establish CI/CD pipelines in Azure DevOps for efficient testing and delivery of both infrastructure and data workloads.Occasional travel to Switzerland for team collaborations at our headquarters in Zurich.
About Satori Analytics Satori Analytics is based in Athens and delivers Data and AI solutions to global brands. The team works across the full data lifecycle, from ingestion to AI applications, and builds technologies for industries such as fintech, airlines, FMCG, retail, manufacturing, and financial services. With more than 100 technology professionals, Satori Analytics is driving data innovation in South-Eastern Europe and beyond. Role Overview: Junior AI Engineer This role focuses on designing and implementing AI workflows and models, with a strong emphasis on Python and large language models (LLMs). The Junior AI Engineer works closely with data scientists, developers, and domain experts to deliver scalable AI solutions and translate business needs into actionable outcomes. What You Will Do Design and implement AI workflows and models: Use Python and LLMs to build scalable solutions that connect AI/ML with software engineering. Collaborate with multidisciplinary teams: Work alongside data scientists, developers, and subject matter experts to move projects forward and share knowledge. Research and apply new techniques: Stay current on AI trends and incorporate emerging methods into your work. Translate business problems: Turn complex business challenges into practical ML/AI solutions, and communicate results clearly. Document your work: Keep thorough, clear records so projects remain reproducible and easy for teammates to understand. Location Athens, Attica, Greece
Join Our Innovative Team at Kaizen Gaming

Kaizen Gaming, the dynamic force behind Betano, stands as one of the premier GameTech companies globally, serving 19 diverse markets. Our mission is to harness advanced technology to deliver unparalleled entertainment experiences to millions of satisfied customers. Our vibrant workforce consists of over 2,700 talented individuals from more than 40 nationalities spanning three continents. We take pride in being recognized among the Best Workplaces in Europe and are certified as a Great Place to Work in all our offices. At Kaizen Gaming, every day is a new opportunity to excel. Are you ready to Press Play on your career potential?

About the Role
As a Lead Data Scientist, you will spearhead our AI initiatives by analyzing complex datasets and developing machine learning models that drive our innovative AI products. The ideal candidate will possess in-depth knowledge of machine learning algorithms and a proven track record of deploying ML/AI applications in production environments.

Key Responsibilities:
- Convert product specifications into actionable machine learning tasks and pinpoint high-impact AI opportunities.
- Conduct comprehensive data analysis to unearth vital patterns and derive actionable insights.
- Execute exploratory data analysis (EDA) and feature engineering to facilitate the modeling process.
- Implement best practices in model selection, parameter tuning, and validation.
- Conduct comparative experiments to enhance model training.
- Analyze machine learning metrics to assess various solution options.
- Oversee the complete lifecycle of AI features, from data collection through model design to implementation and optimization in live environments.
- Mentor junior team members, sharing expertise and leading intricate projects.
Working Model: Hybrid | Type: Full-time

Accepted Ltd. is a leading provider of software and digital transformation services, dedicated to accelerating innovation across various sectors including Finance, Energy, Gaming, and Telecommunications. With over 20 years of engineering excellence, we pride ourselves on creating outcome-focused solutions and assembling high-performing teams that integrate seamlessly with our clients' operations.

We are seeking a Senior GRC & Privacy Engineer to bolster our hybrid delivery teams.

Key Responsibilities
- Engage in workshops and client meetings to identify needs related to Privacy, Data Governance, GRC, Security Assurance, Ethics & Compliance.
- Analyze client business processes and develop customized governance and compliance solutions.
- Design and configure workflows within GRC/Privacy platforms encompassing risk assessments, privacy operations, compliance controls, vendor risk, and ethics workflows.
- Conduct maturity assessments and provide actionable recommendations for privacy, GRC, and security compliance.
- Assist in pre-sales activities by preparing materials, conducting demos, and estimating project scope and effort.
- Create methodologies, templates, processes, and best practices for the service line.
- Help formulate policies, procedures, compliance frameworks, and risk registers.
- Mentor junior team members and uphold the quality of deliverables.
Join Kpler, where we simplify the complexities of global trade, providing insights that empower organizations in the commodities, energy, and maritime sectors. Since our inception in 2014, we've been committed to delivering premier intelligence through intuitive platforms. With a diverse team of over 700 experts from more than 35 countries, we transform intricate data into actionable strategies to help our clients thrive in a dynamic market landscape.

We are on the lookout for a passionate and skilled Senior Data Analyst to join our Business Intelligence & Insights team. In this pivotal role, you will contribute to the data foundation that drives Kpler's commercial and strategic decision-making. Reporting to the Head of BI, you'll manage essential data pipelines, design scalable Looker solutions, and serve as a trusted advisor to stakeholders across the organization.

This high-impact position is ideal for individuals who excel at the crossroads of data engineering and business analytics. One day, you'll be constructing robust, production-ready infrastructure, and the next, you'll be translating complex datasets into actionable insights for a commercial audience. If you're excited about working with real-time commodity flow data and shaping the BI strategy of a rapidly growing B2B SaaS company, we want to hear from you.
Are you a driven Data Analyst with expertise in Power BI? Join our dynamic team to play a pivotal role in advancing e-Government initiatives! We are in search of a proficient Data Analyst - Power BI Expert to spearhead the design, analysis, and execution of data models, reports, and dashboards.

Your Responsibilities:
- Engage with stakeholders to comprehend their business requirements and translate them into effective data-driven solutions.
- Craft and implement Power BI reports and dashboards that yield vital business insights.
- Develop and refine data models within Power BI to guarantee seamless data integration and optimize reporting performance.
- Create and manage Power BI Dataflows, ensuring efficient connection and transformation of data from diverse sources for reporting purposes.
- Write optimized SQL queries to extract, cleanse, and transform data from databases such as SQL Server and PostgreSQL, preparing datasets for analysis and reporting.
- Utilize DAX (Data Analysis Expressions) within Power BI to generate measures and calculated columns for comprehensive analysis.
- Publish and manage Power BI reports and dashboards in the Power BI Service, ensuring accessibility and usability for stakeholders.
- Oversee the organization of reports within Power BI workspaces, managing permissions and access rights for users.
- Troubleshoot and enhance Power BI reports to ensure accuracy, reliability, and optimal performance.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Science, Software Engineering, Electrical Engineering, or a related field.
- A minimum of 2 years' experience in business intelligence or data analysis with a strong focus on Power BI.
- Proficiency in DAX for calculations and data modeling.
- Experience with Power Query for data transformation and cleansing.
- Proficient in publishing, sharing, and managing reports in the Power BI Service, including workspace administration.
- Solid understanding of data visualization principles and best practices.
- Knowledge of relational databases and SQL for data retrieval.
- Exceptional analytical, problem-solving, and communication skills.
- Fluent in both spoken and written English.

Preferred Qualifications:
- Experience integrating Power BI with other Microsoft tools (e.g., Excel, Azure) and third-party applications.
- Ability to leverage Power BI's AI capabilities for advanced analytics solutions, including machine learning models and automated insights.
- Familiarity with popular machine learning frameworks and libraries (e.g., TensorFlow, Scikit-learn, PyTorch) for building and deploying models.
- Proficiency in data preprocessing and cleansing for machine learning, including handling missing values, data normalization, and feature engineering.
- Understanding of data warehousing concepts.
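To make the preprocessing qualification above concrete, here is a minimal sketch of two of the named steps (handling missing values and data normalization) in plain Python. The data and function names are invented for illustration; in practice these tasks would typically run inside Power Query or a dedicated ML pipeline.

```python
# Illustrative sketch of two common preprocessing steps:
# mean imputation for missing values and min-max normalization.
# The input data below is made up for demonstration purposes.

def impute_mean(values):
    """Replace None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

def min_max_normalize(values):
    """Scale values linearly to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

raw = [10.0, None, 30.0, 20.0]
clean = impute_mean(raw)           # [10.0, 20.0, 30.0, 20.0]
scaled = min_max_normalize(clean)  # [0.0, 0.5, 1.0, 0.5]
```

Feature engineering would build on the cleaned, scaled columns produced by steps like these.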
Join Our Team as a Field Data Collection Specialist / Surveyor!

At Terry Soot Management Group (TSMG), we specialize in field data collection where automation is not feasible. Founded in 2017, we operate across Europe and North America, providing essential data that helps clients make informed decisions. Our dedicated teams use advanced techniques to capture features, images, videos, audio, and detailed scans of various locations.

Project Overview:
We are embarking on a project aimed at enhancing online mapping applications through the collection of visual and geo data.

Role Responsibilities:
As a Field Data Collection Specialist, you will be equipped with a specialized camera backpack and tasked with traversing designated areas to capture high-quality footage of the surroundings. Your role is crucial in enabling improvements in mapping accuracy and detail.

Work Schedule:
Workdays are Monday through Friday, with daily shifts ranging from 6 to 8 hours.
Join RWS Group as an AI Data Specialist, where you will play a pivotal role in supporting AI and machine learning projects by managing and analyzing linguistic data in Greek. This is an exciting opportunity to contribute to innovative technologies and enhance AI processes.
Optasia is a B2B2X financial technology platform focused on scoring, financial decision-making, disbursement, and collection. The company's mission centers on advancing financial inclusion worldwide and reshaping the financial sector through a distinct approach.

Role Overview
The Quantitative Risk Data Scientist will join the Credit Portfolio Optimization team in Athens. This position sits at the intersection of risk management, research, and technology, contributing to algorithmic trading and portfolio optimization projects. The role involves working alongside traders, big data experts, and machine learning engineers to support real-time decision-making and maintain strong system performance.

What You Will Do
- Design and implement algorithmic solutions to maximize revenue through detailed credit risk analysis.
- Extract actionable insights on credit risk using advanced big data analytics.
- Develop tools and procedures for portfolio risk assessment to monitor and manage credit risk effectively.
- Conduct thorough risk analyses of microloans and other financial products, refining risk models to improve decision quality.
- Identify and evaluate credit risk factors by applying advanced computational techniques to large datasets.
- Build predictive models using statistical and machine learning methods to strengthen risk management strategies.
- Collaborate closely with data scientists and machine learning engineers.
- Continuously update risk assessment methods in response to changing market conditions.

What We Look For
- Bachelor's or Master's degree in Data Science, Statistics, Finance, Mathematics, or a related discipline.
- 2-5 years of experience in quantitative risk analysis, preferably within financial services.
- Demonstrated skill in developing and applying algorithmic models for revenue and risk optimization.
- Strong background in statistical modeling and experience with machine learning models.
- Proficiency in Python or R, and hands-on experience with big data risk analytics.
- Track record of designing and deploying portfolio risk assessment tools.
- Excellent problem-solving skills and attention to detail when working with complex datasets.

Key Attributes
- Sound judgment and strong analytical thinking.
- Self-driven, resourceful, and comfortable working independently.
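As a rough illustration of the kind of predictive credit-risk modeling this role describes, the sketch below scores a borrower's probability of default with a logistic model. The features and coefficients are entirely invented for demonstration; a real model would be fitted to historical loan data.

```python
import math

# Hypothetical logistic scoring sketch for probability of default (PD),
# of the kind used in microloan risk analysis. The feature names and
# coefficient values below are invented purely for illustration.

COEFFICIENTS = {
    "intercept": -2.0,
    "missed_payments": 0.8,  # each missed payment pushes PD upward
    "utilization": 1.5,      # fraction of the credit limit in use
}

def probability_of_default(missed_payments, utilization):
    """Logistic model: PD = 1 / (1 + exp(-z)) for a linear score z."""
    z = (COEFFICIENTS["intercept"]
         + COEFFICIENTS["missed_payments"] * missed_payments
         + COEFFICIENTS["utilization"] * utilization)
    return 1.0 / (1.0 + math.exp(-z))

low_risk = probability_of_default(0, 0.1)   # clean history, low utilization
high_risk = probability_of_default(3, 0.9)  # missed payments, high utilization
```

A portfolio-level risk tool would aggregate such per-loan PDs (for example into expected-loss estimates) rather than score loans in isolation.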
Satori Analytics develops data and AI solutions for clients in industries such as FMCG, retail, manufacturing, and financial services. With a team of over 100 technology professionals based in Athens, the company manages projects that span the entire data lifecycle, from ingestion through to deploying AI applications. Projects range from building cloud infrastructure for fintech clients to delivering predictive analytics for airlines.

Role overview
The AI Engineer designs and builds AI workflows and models, focusing on Python and large language models (LLMs). This role involves close collaboration with data scientists, developers, and domain experts to turn business needs into practical AI and machine learning solutions. The position combines research, hands-on engineering, and teamwork within a growing group.

What you will do
- Design and implement AI models and workflows using Python, with a particular focus on large language models (LLMs).
- Collaborate with data scientists, software engineers, and subject matter experts to exchange ideas, review progress, and solve challenges together.
- Research new trends in AI and machine learning, applying relevant techniques to current projects.
- Translate business challenges into actionable machine learning or AI solutions, and communicate results clearly to stakeholders.
- Document code and processes to support clarity and reproducibility for the team.

Location
This position is based in Athens, Attica, Greece.
Join our team as a Senior Backend Engineer (Python) and engage in a challenging project that focuses on data-intensive processing, system integration, and semantic technologies. In this pivotal role, you will take charge of an established codebase, enhancing its stability and evolving it to meet the project's demands. This position is perfect for seasoned engineers who thrive in working with complex systems and have a solid grasp of data-driven architectures.

Key Responsibilities:
- Thoroughly analyze and take ownership of an existing Python-based backend.
- Refactor and optimize backend components with an emphasis on maintainability and performance.
- Develop and sustain data processing pipelines and integration workflows.
- Collaborate closely with data and semantic engineers on RDF/SPARQL-driven processes.
- Contribute to the architectural redesign and technical documentation.
- Assist with deployment, configuration, and troubleshooting tasks.
- Ensure high code quality through rigorous reviews, testing, and adherence to best practices.

Requirements:
- A minimum of 3 years of professional experience in backend development.
- Expertise in Python programming.
- Experience with Apache Airflow and AWS; data-intensive or integration-heavy systems; APIs, batch processing, and backend services; and configuration-driven systems (XML / JSON / YAML).
- Strong understanding of software architecture and design patterns, and of debugging and maintaining legacy codebases.
- Proven experience in complex, multi-stakeholder projects (experience in EU or public-sector projects is a plus).

Preferred Qualifications:
- Familiarity with Semantic Web technologies (RDF, SPARQL, OWL).
- Experience in data modeling or knowledge-based systems.
- Exposure to DevOps practices (CI/CD, containerization).
- Experience in contributing to or maintaining technical documentation (e.g., AsciiDoc, Antora).

Benefits:
We value talent and commitment, offering the following perks:
- Attractive full-time salary.
- Private health insurance under the company's group plan.
- Flexible working hours.
- Access to top-quality tools.
- Opportunities for professional development, including language courses and specialized training.
- Career advancement potential by collaborating with leading specialists in the field.
- A dynamic work environment that encourages personal and professional growth.

If you're ready for an exciting challenge, we would love to hear from you!
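For readers unfamiliar with the "configuration-driven systems" requirement mentioned in this posting, the sketch below shows the basic pattern: behavior is defined in a config document (here JSON, though XML or YAML work the same way) and interpreted by generic code. The config schema, operation names, and sample input are all invented for illustration.

```python
import json

# Minimal sketch of a configuration-driven processing step. A JSON
# document declares an ordered list of operations; the code dispatches
# each one through a registry instead of hard-coding the pipeline.
# The schema and field names here are hypothetical.

CONFIG = json.loads("""
{
  "pipeline": [
    {"op": "strip"},
    {"op": "lower"},
    {"op": "prefix", "value": "doc:"}
  ]
}
""")

# Registry mapping operation names to implementations.
OPS = {
    "strip": lambda text, step: text.strip(),
    "lower": lambda text, step: text.lower(),
    "prefix": lambda text, step: step["value"] + text,
}

def run_pipeline(text, config):
    """Apply each configured operation to the input, in order."""
    for step in config["pipeline"]:
        text = OPS[step["op"]](text, step)
    return text

result = run_pipeline("  Report-42  ", CONFIG)  # "doc:report-42"
```

The same dispatch-from-config idea scales up to batch jobs and integration workflows, where changing the pipeline means editing configuration rather than code.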
Optasia is an innovative B2B2X financial technology platform dedicated to enhancing scoring, financial decision-making, disbursement, and collection processes. Our mission is to foster financial inclusion for everyone. Join us in transforming the financial landscape!

We are on the lookout for passionate and driven professionals who thrive in a collaborative environment. As a member of our dynamic team, you will contribute to delivering innovative solutions that make a difference.

Data plays a pivotal role in Optasia's growth strategy, with our ML Engineering team being a key contributor. We harness data from various sources into our extensive big data clusters and develop and manage multiple analytical pipelines using state-of-the-art big data technologies.

As a Senior ML Infrastructure Engineer, you will play a crucial role in enhancing Optasia's data-driven decision-making and credit risk management by developing and optimizing scalable, end-to-end ML pipelines. Your key responsibilities will include: (i) building robust ML pipelines, (ii) designing statistical and machine learning algorithms, and (iii) operationalizing these solutions to bolster credit risk management, directly impacting Optasia's success.

Your Responsibilities:
- Provide technical guidance in ML engineering to ensure the adoption of optimal tools and methodologies, staying ahead of emerging trends and delivering industry-leading solutions.
- Enhance the scalability, stability, accuracy, speed, and efficiency of ML workflows while maintaining stringent testing and code quality standards.
- Contribute to the design and development of microservices and tools that facilitate the Machine Learning lifecycle at Optasia.
- Collaborate on the design and implementation of scalable, real-time microservices utilized globally.
- Foster continuous improvements in the development lifecycle with the team.
- Design, develop, and maintain large-scale Spark jobs using PySpark and Scala.
- Build and manage CI/CD pipelines with Jenkins.
- Create automation scripts using Python or Bash.
- Develop and deploy scalable Airflow pipelines to support the Machine Learning lifecycle.
- Conduct data exploration and analysis to scope, build, and refine Machine Learning proof-of-concepts (PoCs).
- Partner with Engineers and the Credit Risk team to design and implement solutions that deliver business value at Optasia.
- Optimize the codebase through Spark job tuning and refactoring.
- Drive enhancements to our feature engineering engine for improved efficiency.
Join Kaizen Gaming

At Kaizen Gaming, we are at the forefront of the GameTech industry, operating in 20 markets and delivering exceptional entertainment experiences to millions of customers. Our team of over 2,700 professionals hails from more than 40 nationalities across three continents, making us a truly diverse and innovative workplace. We are proud to be recognized as one of the Best Workplaces in Europe, certified as a Great Place to Work in our offices. Here, every day is unique and filled with opportunities to grow. Are you ready to 'Press Play' on your potential?

About the Role
As the Data Science Team Lead, you will spearhead an AI product team dedicated to developing impactful AI solutions. Your role will involve analyzing data, designing machine learning models, and collaborating with technology teams to launch successful AI products. The ideal candidate will possess a robust understanding of machine learning algorithms, extensive experience in building production-ready ML/AI applications, and a proven track record of leading AI teams.

Your Responsibilities
- Lead and manage a team of 5 to 7 talented data scientists and machine learning engineers.
- Translate product requirements into machine learning challenges while identifying key areas for AI-driven business impact.
- Conduct data analysis to uncover significant patterns and insights.
- Perform exploratory data analysis (EDA) and feature engineering to enhance the modeling process.
- Implement best practices in model selection, optimization, and tuning.
- Execute comparative experiments for model training.
- Evaluate various potential solutions through ML metrics analysis.
- Oversee the complete lifecycle of AI features, from data collection to design, implementation, and optimization in production environments.
- Mentor junior team members and guide the team through complex projects.
Join Our Team at Kaizen Gaming!

Kaizen Gaming, the driving force behind Betano, is a leading GameTech company with operations in 20 global markets. We are dedicated to harnessing the latest technology to deliver exceptional entertainment experiences to millions of our valued customers. Our diverse workforce consists of over 2,700 Kaizeners from more than 40 nationalities across three continents. We take pride in being recognized as one of the Best Workplaces in Europe and as a certified Great Place to Work. Here, each day offers unique challenges and opportunities. Are you ready to unleash your potential?

About the Role
As a Senior AI Data Governance Analyst, you will play an essential role within our Data/AI Governance team, contributing to the design and execution of a comprehensive AI governance framework across the organization. Collaborating with both business and technical stakeholders, you will ensure the integrity, consistency, availability, and compliance of AI models and applications, which is vital in the highly regulated betting and gaming industry.
Key Responsibilities:
- Develop and enforce AI governance policies, standards, and procedures.
- Facilitate the implementation of AI metadata management, data lineage, and quality frameworks.
- Profile datasets to evaluate their accuracy, completeness, consistency, and integrity for effective AI utilization.
- Identify and document AI quality issues, and recommend corrective measures.
- Monitor and analyze AI quality KPIs, supporting remediation initiatives.
- Maintain AI quality dashboards and reports to communicate trends to stakeholders.
- Engage in cross-functional meetings to align AI governance priorities.
- Enhance AI business glossaries, data dictionaries, and catalogs.
- Collaborate with AI/Data owners, stewards, and custodians to promote governance best practices.
- Assist with compliance-related AI & Data initiatives (e.g., EU AI Act, GDPR, AML, Responsible Gaming reporting).
- Evaluate and support AI/Data governance tools (e.g., Atlan, Collibra, Informatica, Microsoft Purview).
- Contribute to the continuous evolution of the AI governance framework and best practices.
Join Bioiatriki as a Data Entry Clerk, where you will play a crucial role in ensuring the accuracy and efficiency of our data management systems. This position requires a keen eye for detail and the ability to manage large volumes of information with precision.

Your responsibilities will include entering and updating data in our systems, verifying data accuracy, and assisting with administrative tasks as needed. If you have strong organizational skills and are comfortable working with data, we want to hear from you!
About Us:
At tbibankgr, we are a dynamic challenger bank situated in Southeast Europe, recognized as a regional leader in alternative payment solutions. Our innovative ecosystem merges financing and shopping to meet the diverse needs of our customers. With a successful and profitable business model, we are proud to serve clients in Romania, Germany, Bulgaria, and Lithuania. In our quest to expand our global presence, we have established a significant footprint in Greece, collaborating with thousands of merchants and consumers. Are you ready to contribute to our unique success story?

Your Role:
We are seeking a Senior Data Analyst to join our dedicated team in Greece!

Key Responsibilities:
- Conduct data discovery to identify insights.
- Collaborate with partners to explore potential data sources and develop data interpretation logic.
- Acquire data from both primary and secondary sources.
- Analyze and interpret data, providing regular and ad-hoc reports and insights.
- Engage in the continuous development of our Data Warehouse (DWH) facilities.
- Work with extensive datasets from multiple sources to create cohesive data reports.
- Utilize data manipulation and transformation tools effectively.
Role overview
d-one seeks a Senior Databricks Engineer in Athens, Greece. The focus is on developing and enhancing data solutions that help meet business objectives. Candidates should bring strong hands-on experience with Databricks and a solid track record in designing, implementing, and improving analytics infrastructure.

What you will do
- Design and build data pipelines and solutions with Databricks.
- Collaborate with cross-functional teams to gather requirements and turn them into technical deliverables.
- Develop and maintain ETL processes.
- Optimize data workflows for better performance and scalability.
- Contribute to the ongoing development of the company's data strategy.

Location
This role is based in Athens, Attica, Greece.
Join Netcompany as a Lead Business Intelligence (BI) Engineer and take charge of designing and implementing robust BI solutions. You will lead a team of BI developers and analysts, ensuring high-quality data analysis that drives decision-making across the organization. This role requires a blend of technical expertise and leadership skills to manage BI projects effectively.
Elevate Your Career with Bally's Intralot as a Business Intelligence & Reporting Engineer!

We invite you to become part of our dynamic team at Bally's Intralot, where you will play a pivotal role in revolutionizing the gaming industry. As a Business Intelligence & Reporting Engineer, your expertise will directly influence the development and implementation of essential reporting solutions that enhance operational efficiency and empower data-driven decisions across our international teams.

Key Responsibilities:
- Create and maintain efficient data pipelines for seamless extraction, transformation, and loading of information from diverse data sources.
- Compile and manage extensive, complex datasets that fulfill both functional and non-functional business requirements.
- Identify, design, and implement improvements to internal processes, including automation of manual tasks, optimization of data delivery, and infrastructure redesign for enhanced scalability.
- Develop operational and non-operational reports utilizing SQL to meet evolving business demands.

Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related technical field.
- Minimum of 2 years of professional experience in designing and developing automated reporting solutions.
- Proficient in relational databases such as MySQL and MS SQL Server.
- Familiarity with reporting design tools, including Jasper Reports, Power BI, and Tableau.
- Experience with Databricks is a plus.
- Excellent command of both written and verbal English and Greek.
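As a small illustration of the SQL-based reporting this role centers on, the sketch below builds a one-query operational report over an in-memory SQLite database. The table, column names, and sample rows are made up for demonstration; the actual role targets MySQL and MS SQL Server, where the same GROUP BY pattern applies.

```python
import sqlite3

# Illustrative sketch of an operational SQL report: aggregate stakes
# per market, largest total first. Uses an in-memory SQLite database
# with invented sample data purely for demonstration.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bets (market TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO bets VALUES (?, ?)",
    [("GR", 10.0), ("GR", 15.0), ("DE", 7.5)],
)

# The report itself: one aggregation query, ordered for readability.
report = conn.execute(
    "SELECT market, SUM(amount) AS total "
    "FROM bets GROUP BY market ORDER BY total DESC"
).fetchall()
conn.close()
```

In a production reporting pipeline, a query like this would be parameterized (for example by date range) and its output fed into a tool such as Power BI or Jasper Reports.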