Database dev project
$30-250 USD
Paid on delivery
Loading large datasets (> 10 GB) can be very slow, especially when tables carry many indexes. One simple trick is to sort the data before inserting it, or to drop the indexes, insert the data, and rebuild the indexes afterwards. The goal of this project is to investigate other techniques for improving load performance in an open-source database such as Postgres or MySQL/InnoDB. One concrete idea is a "fast bulk load" à la HBase for MySQL: HBase uses a MapReduce job to build its index files, and a possible project would be a prototype that does the same for a single (or multiple) MySQL boxes.
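As a rough illustration of the drop/sort/rebuild trick described above, here is a minimal sketch using Python's built-in sqlite3 module (standing in for Postgres or MySQL, which would need a client library and a running server; the table and index names are made up for the example):

```python
import random
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER, payload TEXT)")
cur.execute("CREATE INDEX idx_events_id ON events (id)")

# Simulated batch of rows to bulk-load, in random key order.
rows = [(random.randrange(1_000_000), f"payload-{i}") for i in range(10_000)]

# 1. Drop the index so each INSERT does not pay per-row index maintenance.
cur.execute("DROP INDEX idx_events_id")

# 2. Sort the batch by the indexed key; the rebuilt index then reads the
#    table in near-sorted order, which is cheaper than random insertions.
rows.sort(key=lambda r: r[0])

# 3. Bulk-insert inside a single transaction.
cur.executemany("INSERT INTO events (id, payload) VALUES (?, ?)", rows)

# 4. Recreate the index in one pass over the fully loaded table.
cur.execute("CREATE INDEX idx_events_id ON events (id)")
conn.commit()

count = cur.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # → 10000
```

In Postgres the same pattern would typically use COPY instead of row-by-row INSERTs, and in MySQL/InnoDB LOAD DATA INFILE; the sorting step matters most for secondary indexes, since InnoDB clusters the table on the primary key anyway.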
Project ID: #11960554
About the project
7 freelancers bid an average of $200 for this project
Hi, I am a full-time PHP developer with native English skills. I have 8 years of experience in web applications and can do all-round development. Technical details about me: strong PHP/MySQL background with s…
Dear sir, I have working experience developing machine learning applications using Java, Weka, Spark, and Hadoop. I also have working experience developing applications on AWS. Please feel free to…
Hi, this is Vikash. I have been working with Hadoop and Big Data for the past 3 years. I have worked on: 1. MapReduce 2. Hive 3. Pig 4. HDFS. I have done tasks like: 1. Sentiment Data Analysis 2. Twitter Log Ana…