Cross Entropy Loss in PyTorch
Ben Cook • Posted 2020-07-24 • Last updated 2021-10-14
Cross entropy loss in PyTorch can be a little confusing. Here is a simple explanation of how it works for people who get stuck.
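The main source of confusion is that PyTorch's `nn.CrossEntropyLoss` expects raw, unnormalized scores (logits) rather than probabilities: it applies log-softmax and negative log-likelihood internally. Here's a minimal sketch of what it computes for a single example, in plain Python; the helper name `cross_entropy` is just for illustration:

```python
import math

def cross_entropy(logits, target):
    # nn.CrossEntropyLoss combines log-softmax and negative
    # log-likelihood: loss = -log(softmax(logits)[target])
    m = max(logits)  # subtract the max for numerical stability
    log_sum_exp = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_sum_exp - logits[target]  # equals -log p(target)

# Raw scores for 3 classes; the true class is index 0
loss = cross_entropy([2.0, 0.5, 0.1], 0)
print(loss)
```

In actual PyTorch code you would pass a batch of logits with shape `(N, C)` and a tensor of integer class indices with shape `(N,)` to `torch.nn.CrossEntropyLoss()` — no softmax beforehand, since applying one yourself would double-normalize the scores.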