
Integrate our big data infrastructure in the Amazon Web Services cloud

$12-25 USD / hour

Closed
Posted over 6 years ago

We are seeking a developer to partner with other engineering teams to help architect and build the data pipeline that ingests hundreds of billions of data points for our Field Analytics Platform on AWS. You will expand our infrastructure's capability using open source data processing technologies such as Hadoop, Kafka, Spark, Cassandra, and Neo4j; become an expert in the AWS services we leverage; and help us efficiently integrate our big data infrastructure in the AWS cloud. You will build services, deploy models and algorithms, perform model training, and provide tools that make our infrastructure more accessible to all our data scientists, enabling initiatives that build our capabilities from environmental classification to In-Season Field Analytics and more. Requires a degree or 15+ years of experience in the field or a related area.
Project ID: 15937640
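
As a rough illustration of the kind of pipeline the description implies, the sketch below reads data points from Kafka with Spark Structured Streaming and lands them in S3 as Parquet for downstream analytics. The broker address, topic name, bucket paths, and schema are placeholder assumptions for illustration, not details from this project.

# Minimal sketch (assumed names): Kafka -> Spark Structured Streaming -> S3.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = (SparkSession.builder
         .appName("field-analytics-ingest")
         .getOrCreate())

# Example schema for one data point; a real platform would define this per source.
schema = StructType([
    StructField("field_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", DoubleType()),
    StructField("observed_at", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
       .option("subscribe", "field-telemetry")              # placeholder topic
       .load())

# Kafka values arrive as bytes; parse the JSON payload into typed columns.
points = (raw.select(from_json(col("value").cast("string"), schema).alias("p"))
          .select("p.*"))

# Write Parquet to S3 so downstream Spark/Hive jobs and data scientists can query it.
query = (points.writeStream
         .format("parquet")
         .option("path", "s3a://example-bucket/field-analytics/points/")        # placeholder bucket
         .option("checkpointLocation", "s3a://example-bucket/checkpoints/ingest/")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()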

About the project

12 proposals
Remote project
Active 6 yrs ago

12 freelancers are bidding an average of $21 USD/hour for this job
Yes, I have good knowledge of AWS architecture with various APIs. Let's discuss the current setup and the structure needed for further analysis.
$25 USD in 40 days
4.9 (16 reviews)
6.1
Hi, I have more than three years of experience in Hadoop technologies. Please contact me for more details.
$20 USD in 40 days
4.8 (13 reviews)
4.6
I am very strong in AWS with big data implementation. I have done an end-to-end implementation of the Hadoop ecosystem on AWS EC2, and I am strong in Spark, Kafka, Hadoop, and Hive DWH, as well as newer technologies.
$20 USD in 40 days
5.0 (7 reviews)
3.9
Hello, I have 2 years of experience in big data technologies. I have done many projects with Hadoop, Spark, Flink, Kafka, Storm, R, machine learning, etc. I currently work as a big data administrator and developer.
$22 USD in 20 days
4.8 (8 reviews)
3.3
Certified Hadoop admin with expertise in the Hadoop ecosystem: Apache Hadoop, HDFS, Hive, Ambari, Hortonworks, HBase, SQL, Cloudera, MongoDB, Docker, Kafka, Storm, AWS, Linux, MySQL, Nagios, Ganglia, the Titan graph database, Solr, and Kerberos. Will give a free demo and end-to-end solutioning with free troubleshooting.
$12 USD in 40 days
0.0 (0 reviews)
0.0
I am an AWS engineer with 2.8 years of IT experience, including work with DevOps tools and the OpenShift automation platform (Docker, Kubernetes). Relevant Skills and Experience: version control with Git, CI/CD with Jenkins, and configuration management with Chef.
$20 USD in 30 days
0.0 (0 reviews)
0.0
We have extensive experience working with global business advisory, ERP, HRMS, and software publisher companies. We help customers with in-depth knowledge and transparency on licensing solutions and technology advisory services that deliver cost-effective solutions and a forward-thinking approach. Thanks to a highly skilled and flexible team, Online24x7 is in a position to provide services in the following areas:
• Microsoft Dynamics 365
• Ramco implementation (ERP & HCM/HRMS)
• PeoplesHR implementation (HCM/HRMS)
• IBM products
• DataMatics implementation for analytics and BI (business intelligence) tools
• Inventory management and asset management tools
• Brillio implementation partner for e-commerce and SOW-based work
• BPO/KPO services
• AWS/Azure cloud-based services
• Training & development (Amazon Web Services, Microsoft Azure, DevOps, Linux, Unix, machine learning, big data, Hadoop, Salesforce, advanced Excel, cloud computing, IT infrastructure)
• Resource pooling
• Microsoft licenses
We have well-experienced AWS, cloud, and DevOps consultants on our team. We are keen to associate with you.
$27 USD in 40 days
0.0 (0 reviews)
0.0
• 10 years of experience in the IT industry, spanning digital analytics, data warehousing, and database development.
• Expert-level skills and hands-on experience with Hadoop, Hive, Pig, Sqoop, Netezza, SQL, Oracle PL/SQL, ETL, Apache Spark, and Python.
• Experience migrating a product platform from Netezza to Hadoop.
• Worked on the Hortonworks distribution.
• Experience with Amazon Web Services (AWS).
• Design and requirement gathering, development, and unit testing.
• Expertise in shell scripting and scheduling with Jenkins.
• Expertise in developing and implementing Hive and Pig scripts and Sqoop commands.
• Worked with the infrastructure team on cluster configuration and implementation for various clients.
• Good knowledge and understanding of the MapReduce framework and YARN.
$20 USD in 40 days
0.0 (0 reviews)
0.0

About the client

Hyderabad, India
5.0
2
Payment method verified
Member since Feb 1, 2012
