Newsletter Subject

Your Year on Kaggle: Most Memorable Community Stats from 2016

From

kaggle.com

Email Address

team@kaggle.com

Sent On

Thu, Dec 29, 2016 09:23 PM

Email Preheader Text

Happy New Year to all 765,000 of you Hello! As we enter a new year, we want to share and celebrate s

Happy New Year to all 765,000 of you

[View this email in your browser]

Hello! As we enter a new year, we want to share and celebrate some of your 2016 highlights in the best way we know how: through numbers. There are major machine learning trends, impressive achievements, and fun factoids that all add up to one amazing community. Enjoy!

Cheers,
The Kaggle Team

Community

Kaggle wouldn't be Kaggle without you. And in 2016 we welcomed well over 300,000 new users to our community from all over the world.

Walter Reade (inversion) became the world's first #1 Grandmaster in Discussions under our new progression system. [He shares his journey to the top »]

This is the highest tf-idf calculated from the words in our Twitter followers' bios. What's the word, you ask? Hint: it's not "#bigdata". The word is analytics.

The one-millionth Kaggler is currently projected to register on September 9th, 2017. In other words, that special moment will happen at 1504915200 in Unix epoch time.

We applaud the eighty-eight Kagglers who've achieved Grandmaster status in Competitions, plus one Kernel Master, [ZFTurbo].

And conversation was good in 2016: nearly 50,000 discussion posts were shared, including remembrances of the life and accomplishments of Lucas (Leustagos), [a data science hero] and [#1 Kaggler].

Competitions

XGBoost dominated discussion of ML techniques in 2016. But Keras is ending the year strong! Already piquing community interest, how will newcomer LightGBM do in 2017?

Kagglers broke new participation records in 2016. Over 5,500 competitors accepted the challenge to improve [Santander's Customer Satisfaction].

"Trust the numbers, trust the data." Our own Amanda Schierz ([Bluefool]) got major props for her March ML Mania predictions in a 5:02-minute [video by FiveThirtyEight »]

We saw 1.92 times more Kaggle InClass competitions launched by professors in 2016 compared to last year. 21,304 high fives to the students who made a submission!

This year over 60,000 Kagglers competed for $1.1M in prizes, jobs, and knowledge. Thirty-nine winning teams [shared their approaches] on No Free Hunch, and 154,986 submissions were made to the Titanic Getting Started competition alone. Plus, the future of competitions is bright: we launched our [first Code Competition] in December.

Kernels & Datasets

R used to be the language of choice on Kaggle, but 2016 has seen Python emerge as a clear winner. Will Python maintain its constrictive grip in the coming year?

Our open data platform isn't quite like the Billboard 200. But if it were, the dataset [How ISIS Uses Twitter] would be a top-10 chart-topper for an impressive 31 weeks.

Kagglers have published seven Pokémon-related datasets which together claim nearly 9,000 downloads. [Gotta download 'em all! »]

Kagglers upvoted one another's kernels 31,091 times, and 12.8% of these kernels received 10+ upvotes. Way to go! Read about some of our favorites [here »]

When it came to open datasets published in 2016, sports and games were the clear winners. From the [European Soccer Database] with its 326 kernels to [20 Years of Games from IGN], the numbers make us look forward to a fantastic, data-filled 2017.

Copyright © 2016 Kaggle, All rights reserved.

You are receiving this email because you are a Kaggle user, or you are on Kaggle's mailing list. Want to change how you receive these emails? You can [update your preferences] or [unsubscribe from this list].
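
The tf-idf factoid above can be reproduced on any collection of short texts. The sketch below is a minimal illustration rather than Kaggle's actual analysis: the list of follower bios is made-up placeholder data, and scikit-learn's TfidfVectorizer is used to find the term with the highest tf-idf weight.

    # Minimal sketch of the tf-idf calculation mentioned in the newsletter.
    # The `bios` list is hypothetical stand-in data, not Kaggle's real follower bios.
    from sklearn.feature_extraction.text import TfidfVectorizer

    bios = [
        "analytics and machine learning enthusiast",
        "data scientist, kaggle competitor, analytics",
        "analytics lead building dashboards and models",
    ]

    vectorizer = TfidfVectorizer()
    tfidf = vectorizer.fit_transform(bios)  # sparse matrix: documents x terms

    # Take the maximum tf-idf weight each term reaches across all bios,
    # then report the term with the largest value.
    max_per_term = tfidf.max(axis=0).toarray().ravel()
    terms = vectorizer.get_feature_names_out()
    top = max_per_term.argmax()
    print(terms[top], max_per_term[top])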
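
The Unix timestamp quoted above is easy to verify: 1504915200 seconds after the epoch is midnight UTC on September 9th, 2017. A quick check in Python:

    # Verify that 1504915200 in Unix epoch time corresponds to 2017-09-09 (UTC).
    from datetime import datetime, timezone

    ts = 1504915200
    print(datetime.fromtimestamp(ts, tz=timezone.utc))
    # -> 2017-09-09 00:00:00+00:00

    # And the reverse direction: date -> epoch seconds.
    dt = datetime(2017, 9, 9, tzinfo=timezone.utc)
    print(int(dt.timestamp()))
    # -> 1504915200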

Marketing emails from kaggle.com

Sent On: 29/05/2024, 23/05/2024, 10/05/2024, 01/05/2024, 30/04/2024, 26/04/2024

Email Content Statistics

Subject Line Length

Data shows that subject lines with 6 to 10 words generate a 21 percent higher open rate.

Number of Words

The more words in the content, the more time the user will need to spend reading. Get straight to the point with catchy short phrases and interesting photos and graphics.

Number of Images

More images or large images might cause the email to load slower. Aim for a balance of words and images.

Time to Read

Longer reading time requires more attention and patience from users. Aim for short phrases and catchy keywords.

Predicted open rate

Spam Score

Spam score is determined by a large number of checks performed on the content of the email. For the best delivery results, it is advised to lower your spam score as much as possible.

Flesch reading score

The Flesch reading-ease score measures how complex a text is: the lower the score, the harder the text is to read. It combines the average sentence length (measured in words) and the average number of syllables per word into a single reading-ease value. Text with a very high score (around 100) is straightforward and easy to read, with short sentences and no words of more than two syllables. A score of 60-70 is usually considered acceptable for web copy.
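
For reference, the standard Flesch reading-ease formula combines those two averages as 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words). Below is a minimal sketch of that calculation; the vowel-group syllable counter is a rough approximation, not the exact method any particular tool uses.

    # Rough Flesch reading-ease calculator.
    # Syllables are approximated by counting vowel groups, which is only an estimate.
    import re

    def count_syllables(word: str) -> int:
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text: str) -> float:
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return (206.835
                - 1.015 * (len(words) / len(sentences))
                - 84.6 * (syllables / len(words)))

    print(round(flesch_reading_ease("Short words are easy to read. Long sentences are not."), 1))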

Technologies

What powers this email? Every email we receive is parsed to determine the sending ESP and any additional email technologies used.
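
Detecting the sending ESP generally comes down to inspecting the raw message headers. The sketch below is a simplified illustration of that idea, not SimilarMail's actual pipeline: the fingerprint table and the "newsletter.eml" path are hypothetical.

    # Guess the sending ESP from message headers -- an illustrative heuristic only.
    from email import message_from_binary_file
    from email.policy import default

    # Hypothetical fingerprints: substrings of header values mapped to ESP names.
    FINGERPRINTS = {
        "sendgrid.net": "SendGrid",
        "list-manage.com": "Mailchimp",
        "amazonses.com": "Amazon SES",
    }

    def guess_esp(path: str) -> str:
        with open(path, "rb") as fh:
            msg = message_from_binary_file(fh, policy=default)
        # Headers such as Received, Return-Path, and List-Unsubscribe usually
        # carry the ESP's sending domain.
        haystack = " ".join(
            str(value) for name, value in msg.items()
            if name.lower() in {"received", "return-path", "list-unsubscribe", "x-mailer"}
        ).lower()
        for needle, esp in FINGERPRINTS.items():
            if needle in haystack:
                return esp
        return "unknown"

    print(guess_esp("newsletter.eml"))  # hypothetical file path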

Email Size (not including images)

Font Used


Copyright © 2019–2024 SimilarMail.