
BigData Architect

Company Name:
Valleysoft Inc
BS or MS degree in Computer Science or equivalent
8+ years of experience in software and architecture design and development of large-scale, fault-tolerant, and highly scalable enterprise and web-based applications
Solid experience in Core Java development, and in the design and development of web services within large-scale, fault-tolerant, multi-threaded distributed systems.
3+ years of experience in the development of Hadoop APIs and MapReduce jobs for large scale data processing.
Strong background in Big Data technologies and the Hadoop ecosystem (e.g. Hive, Pig, Flume, Storm, etc.)
Experience working with NoSQL data stores like HBase, Cassandra, MongoDB, etc. and RDBMS such as Oracle DB and SQL Server.
Experience with BI, data analytics and MPP databases like Vertica.
Familiarity with cloud services like AWS, Rackspace, or HP Cloud.
Solid understanding and experience with extract, transform, load (ETL) methodologies in a multi-tiered stack, integrating with Big Data systems like Hadoop and Cassandra.
Hands-on experience working in Linux, Unix, Windows environments.
Must be a team player and enjoy working in a cooperative and collaborative team environment.
Passionate about learning new technologies and standards
Strong verbal and written communication skills
Experience working in an onsite/offshore model
Responsibilities:
Architect, design, and develop web services for large-scale, multi-threaded applications.
Fill a hands-on technical role; contribute to all phases of the SDLC, including but not limited to analysis, architecture, design, implementation, and testing.
Develop web services, predominantly in Java, that process terabytes of data per month.
Use SQL and NoSQL data stores extensively: querying, tuning, mapping, and operations.
Architect, provision, tune, and develop Hadoop or other distributed systems.
Design, prototype, and drive enterprise solutions across multiple business domains.
Run technical forums across multiple business units and provide feedback and best practices.
Formulate MapReduce ETL processes and work with downstream data warehouses for analytics.
Produce architecture/design documentation and participate in design/code reviews.
Work with cross-functional teams, both onsite and offshore, and mentor team members.

Don't Be Fooled

The fraudster sends a check to a victim who has accepted a job. The check may be presented as covering any number of things, such as a signing bonus or supplies. The victim is instructed to deposit the check, use part of the money for those purposes, and then send the remaining funds back to the fraudster. The check eventually bounces, and the victim is left responsible for the full amount.