Manifold Blog

Custom Loss Functions for Gradient Boosting

Posted by Prince Grover on Sep 28, 2018 3:27:51 PM

By Prince Grover and Sourav Dey


Gradient boosting is widely used in industry and has won many Kaggle competitions. The internet already has many good explanations of gradient boosting (we've shared some selected links in the references), but we've noticed a lack of information about custom loss functions: the why, when, and how. This post summarizes why custom loss functions matter in many real-world problems, and how to implement them with the LightGBM gradient boosting package.
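As a taste of what the full post covers: LightGBM accepts a custom objective as a callable that returns the per-example gradient and Hessian of the loss with respect to the model's predictions. Below is a minimal sketch of one common real-world case, an asymmetric squared error that penalizes under-prediction more heavily than over-prediction. The function name and the `alpha` parameter are our own illustrative choices, not part of LightGBM.

```python
import numpy as np

def asymmetric_mse_objective(y_true, y_pred, alpha=2.0):
    """Asymmetric squared-error objective for gradient boosting.

    Under-predictions (y_pred < y_true) are penalized alpha times
    more than over-predictions. Returns the first and second
    derivatives of the loss with respect to y_pred, which is what
    LightGBM's custom-objective interface expects.
    """
    residual = y_pred - y_true
    # Loss is alpha * residual**2 when under-predicting, residual**2 otherwise;
    # differentiate once for the gradient, twice for the Hessian.
    grad = np.where(residual < 0, 2.0 * alpha * residual, 2.0 * residual)
    hess = np.where(residual < 0, 2.0 * alpha, 2.0)
    return grad, hess
```

With the scikit-learn-style API, a callable like this can typically be passed as the `objective` of an `lgb.LGBMRegressor`; note that the native `lgb.train` interface uses a slightly different callback signature, so check the LightGBM docs for the version you are running.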

Topics: Data science, Data engineering

A Python Toolkit for Docker-First Data Science

Posted by Alexander Ng on Apr 19, 2018 7:00:00 AM

As interest in Artificial Intelligence (AI), and specifically Machine Learning (ML), grows and more engineers enter this popular field, the lack of de facto standards and frameworks for how work should be done is becoming more apparent. A new focus on optimizing the ML delivery pipeline is starting to gain momentum as a result.

Topics: Data engineering, MachOps, Orbyter
