You will be involved in the design and development of Intel's industry-leading Big Data product built around the Apache Hadoop ecosystem. You are expected not only to gain an understanding of specific areas of the Hadoop stack (such as HDFS, Hive, HBase, etc.), but also to understand the challenges and intricacies of deploying, monitoring, managing, and optimizing very large-scale distributed data systems such as Hadoop.
The ideal candidate should have a strong sense of responsibility and commitment, as well as the ability to work in a fast-paced agile development environment. He/she should also be a quick learner with strong problem-solving skills. The Software Engineer intern will join a highly talented solution development team and participate in all phases of development, including technical design, implementation, and quality assurance.
Responsibilities:
o Provide solution consulting, PoC development, and customer support based on Intel's big data products.
o Define the overall solution architecture based on the customer's system requirements.
o Design and implement the big data reference architecture for domain-specific solutions.
o Tune the performance of solutions and applications.
o Research new technologies for big data solutions.
o Other duties as assigned by management.
Requirements:
o Bachelor's or Master's candidate in Computer Science or Software Engineering.
o Familiarity with Hadoop-related technologies.
o Proficient in Java programming.
o Good communication skills. Proficiency in spoken and written English is preferred.
o Familiarity with computer networks is preferred.
o Knowledge of distributed computing is preferred.
o Knowledge of data mining algorithms is preferred.
o Experience with database related technology is a plus.
o Available at least 4 days per week; the total internship will last 6 to 12 months.
o Some travel is required.
o Location: Beijing.
Please send your resume to jianwei.li@intel.com
--
Edited by: leejianwei FROM 134.134.139.*