Daily Shaarli
10/27/24
A repository for the most elegant and useful UNIX commands. Great commands can be shared, discussed, and voted on to provide a comprehensive resource for working from the command line.
Many Linux users have experienced a lasting sense of accomplishment after composing a particularly clever command that achieves multiple actions in just one line or that manages to do in one line what usually takes 10 clicks and as many windows in a graphical user interface (GUI). Aside from being the stuff of legend, one-liners are great examples of why the terminal is considered to be such a powerful tool.
Have you noticed that Git is so integral to working with code that people hardly ever include it in their tech stack or on their CV? The assumption is that you already know it, or at least know enough to get by. But do you?
Git is a Version Control System (VCS): the ubiquitous technology that enables us to store, change, and collaborate on code with others.
With the advent of Llama 2, running strong LLMs locally has become increasingly feasible. Its accuracy approaches that of OpenAI's GPT-3.5, which is good enough for many use cases.
In this article, we will explore how to use Llama 2 for topic modeling without needing to pass every single document to the model. Instead, we will leverage BERTopic, a modular topic modeling technique that can use any LLM to fine-tune topic representations.
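As a rough illustration of that setup, here is a minimal sketch of plugging a Llama 2 text-generation pipeline into BERTopic as a representation model. The model name, prompt wording, and corpus are placeholders, not the article's exact code.

```python
# Sketch: BERTopic with an LLM-based representation model (assumed setup,
# not the article's exact code). Requires `bertopic` and `transformers`.
from transformers import pipeline
from bertopic import BERTopic
from bertopic.representation import TextGeneration

# Any text-generation pipeline can be used; a local Llama 2 chat model is one option.
generator = pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")

# BERTopic fills in [KEYWORDS] and [DOCUMENTS] per topic, so the LLM only sees
# a handful of representative documents per topic instead of the whole corpus.
prompt = (
    "I have a topic described by these keywords: [KEYWORDS]\n"
    "And these representative documents: [DOCUMENTS]\n"
    "Give a short, descriptive label for the topic."
)

representation_model = TextGeneration(generator, prompt=prompt)
topic_model = BERTopic(representation_model=representation_model)

docs = ["replace with your own documents", "..."]  # placeholder corpus
topics, probs = topic_model.fit_transform(docs)
print(topic_model.get_topic_info().head())
```

The point of this arrangement is that the LLM is only asked to label topics BERTopic has already found, which keeps the number of LLM calls small.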
An LLM is not a black box but an ML model (based on neural networks) that predicts the next token given the input prompt and the sequence of previously generated tokens.
How does it capture the context of the input? Multi-head attention lets the model focus on the important words relative to the other tokens in the input sequence. If you're interested in the mathematics, you can read the blog below.
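For a concrete feel of what attention computes, here is a minimal NumPy sketch of scaled dot-product self-attention; multi-head attention runs several of these in parallel over learned projections. The toy shapes and inputs are assumptions, not taken from the linked post.

```python
# Minimal sketch of scaled dot-product self-attention in NumPy.
# Toy dimensions are illustrative; a real Transformer projects Q, K, V with
# learned weights and runs several such "heads" in parallel.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # how strongly each token attends to every other
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: attention weights sum to 1 per token
    return weights @ V                               # context vector: weighted sum of the values

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))                      # 3 toy token embeddings of dimension 4
out = scaled_dot_product_attention(x, x, x)          # self-attention: Q = K = V
print(out.shape)                                     # (3, 4): one context-aware vector per token
```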