The oPhysics website is a collection of interactive physics simulations. It is a work in progress, and likely always will be. Content will be added as time allows.
About The Author
All of the content on this site was created by me, Tom Walsh. I retired after teaching high school physics for 27 years, and AP Physics for 25 years. Please click my name above to send me feedback about these simulations or suggestions for new simulations I could create.
Large Language Models (LLMs) are on fire, capturing public attention with their ability to provide seemingly impressive completions to user prompts (NYT coverage). They are a delicate combination of a radically simple algorithm with massive amounts of data and computing power. The model is trained by playing a guess-the-next-word game with itself over and over again: each time, it looks at a partial sentence and guesses the following word. If it guesses correctly, it updates its parameters to reinforce its confidence; otherwise, it learns from the error and gives a better guess next time.
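As a rough sketch of that guess-the-next-word game (not any particular LLM's code), here is a toy next-word predictor in PyTorch; the tiny corpus and model are made up purely for illustration:

```python
# Toy illustration of next-word-prediction training: the model sees a word
# and is trained to guess the word that follows it in the corpus.
import torch
import torch.nn as nn

corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# (context word, next word) pairs -- the "guess the next word" game
pairs = [(idx[a], idx[b]) for a, b in zip(corpus, corpus[1:])]
x = torch.tensor([a for a, _ in pairs])
y = torch.tensor([b for _, b in pairs])

# A minimal bigram model: embedding -> logits over the vocabulary
model = nn.Sequential(nn.Embedding(len(vocab), 16), nn.Linear(16, len(vocab)))
opt = torch.optim.Adam(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for step in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)   # penalty for wrong guesses
    loss.backward()               # learn from the error...
    opt.step()                    # ...by updating the parameters

# After training, the model puts high probability on words it has seen follow "the"
probs = torch.softmax(model(torch.tensor([idx["the"]])), dim=-1)
print({w: round(probs[0, i].item(), 2) for w, i in idx.items()})
```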
Projects are fraught with uncertainty, so it is no surprise that the language and tools of probability are making their way into project management practice. A good example of this is the use of Monte Carlo methods to estimate project variables. Such tools enable the project manager to present estimates in terms of probabilities (e.g. there’s a 90% chance that a project will finish on time) rather than illusory certainties. Now, it often happens that we want to find the probability of an event occurring given that another event has occurred. For example, one might want to find the probability that a project will finish on time given that a major scope change has already occurred. Such conditional probabilities, as they are referred to in statistics, can be evaluated using Bayes Theorem. This post is a discussion of Bayes Theorem using an example from project management.
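For concreteness, here is the arithmetic Bayes Theorem involves for exactly that kind of question; the probabilities below are invented purely for illustration and are not taken from the post:

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
# Illustrative numbers only (assumptions, not from the post):
p_on_time = 0.7                 # prior: P(project finishes on time)
p_scope_given_on_time = 0.2     # P(major scope change | finished on time)
p_scope_given_late = 0.6        # P(major scope change | finished late)

# Total probability of observing a major scope change
p_scope = (p_scope_given_on_time * p_on_time
           + p_scope_given_late * (1 - p_on_time))

# Posterior: P(on time | major scope change)
p_on_time_given_scope = p_scope_given_on_time * p_on_time / p_scope
print(f"P(on time | scope change) = {p_on_time_given_scope:.2f}")  # ~0.44
```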
Addressing the need for novel insect observation and control tools, the Photonic Fence detects and tracks mosquitoes and other flying insects and can apply lethal doses of laser light to them. Previously, we determined lethal exposure levels for a variety of lasers and pulse conditions on anesthetized Anopheles stephensi mosquitoes. In this work, similar studies were performed while the subjects were freely flying within transparent cages two meters from the optical system; a proof-of-principle demonstration of a 30 m system was also performed. From the dose–response curves of mortality data created as a function of beam diameter, pulse width, and power at visible and near-infrared wavelengths, the visible wavelengths required significantly lower laser exposure than the near-infrared wavelengths to disable subjects, though near-infrared sources remain attractive given their cost and retina safety. The flight behavior of the subjects and the performance of the tracking system were found to have no impact on the mortality outcomes for pulse durations up to 25 ms, which appears to be the ideal duration to minimize required laser power. The results of this study affirm the practicality of using optical approaches to protect people and crops from pestilent flying insects.
We show that for thousands of years, humans have concentrated in a surprisingly narrow subset of Earth’s available climates, characterized by mean annual temperatures around ∼13 °C. This distribution likely reflects a human temperature niche related to fundamental constraints. We demonstrate that depending on scenarios of population growth and warming, over the coming 50 y, 1 to 3 billion people are projected to be left outside the climate conditions that have served humanity well over the past 6,000 y. Absent climate mitigation or migration, a substantial part of humanity will be exposed to mean annual temperatures warmer than nearly anywhere today.
All species have an environmental niche, and despite technological advances, humans are unlikely to be an exception. Here, we demonstrate that for millennia, human populations have resided in the same narrow part of the climatic envelope available on the globe, characterized by a major mode around ∼11 °C to 15 °C mean annual temperature (MAT). Supporting the fundamental nature of this temperature niche, current production of crops and livestock is largely limited to the same conditions, and the same optimum has been found for agricultural and nonagricultural economic output of countries through analyses of year-to-year variation. We show that in a business-as-usual climate change scenario, the geographical position of this temperature niche is projected to shift more over the coming 50 y than it has moved since 6000 BP. Populations will not simply track the shifting climate, as adaptation in situ may address some of the challenges, and many other factors affect decisions to migrate. Nevertheless, in the absence of migration, one third of the global population is projected to experience a MAT >29 °C currently found in only 0.8% of the Earth’s land surface, mostly concentrated in the Sahara. As the potentially most affected regions are among the poorest in the world, where adaptive capacity is low, enhancing human development in those areas should be a priority alongside climate mitigation.
CLI tool for exploring arXiv (inspired by karpathy's brilliant ArXiv Sanity Preserver)
The script will create data/pdf/, data/txt/, and data/summary/ directories to hold files downloaded from arXiv. I am also aware that this is a rather stupid way to implement a datastore, but DBs seem a bit over the top. Text from PDFs is auto-converted on download and is used to suggest future articles to the user. Downloading articles is idempotent.
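Something like the sketch below captures the idempotent-download idea; this is a hypothetical reimplementation for illustration, not the tool's actual code:

```python
# Sketch of an idempotent arXiv PDF downloader: re-running it skips
# anything already present under data/pdf/.
from pathlib import Path
from urllib.request import urlretrieve

PDF_DIR = Path("data/pdf")

def download_pdf(arxiv_id: str) -> Path:
    """Download an arXiv PDF unless we already have it (idempotent)."""
    PDF_DIR.mkdir(parents=True, exist_ok=True)
    target = PDF_DIR / f"{arxiv_id}.pdf"
    if target.exists():          # already downloaded -> do nothing
        return target
    url = f"https://arxiv.org/pdf/{arxiv_id}"
    urlretrieve(url, target)     # fetch and store under data/pdf/
    return target

# Example: download_pdf("1706.03762") fetches the PDF once; later calls are no-ops.
```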
Zenodo is a free and open digital archive built by CERN and OpenAIRE, enabling researchers to share and preserve research output of any size and format, from all fields of research.
Deep Learning has shown very promising results in the field of Computer Vision, but when it is applied to practical domains such as medical imaging, lack of labeled data is a major challenge.
In practical settings, labeling data is a time-consuming and expensive process. Even though you have a lot of images, only a small portion of them can be labeled due to resource constraints. In such settings, how can we leverage the remaining unlabeled images along with the labeled images to improve the performance of our model? The answer is semi-supervised learning.
FixMatch is a recent semi-supervised approach by Sohn et al. from Google Brain that improved the state of the art in semi-supervised learning (SSL). It is a simpler combination of previous methods such as UDA and ReMixMatch. In this post, we will understand the concept of FixMatch and also see how it got 78% median accuracy and 84% maximum accuracy on CIFAR-10 with just 10 labeled images.
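The core of FixMatch is easy to state in code. The sketch below condenses the unlabeled-data loss from the paper; the stand-in classifier and "augmentations" are placeholders for the real CIFAR-10 pipeline, not the authors' implementation:

```python
# Condensed sketch of the FixMatch unlabeled-data loss (simplified).
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))  # stand-in classifier
threshold = 0.95  # confidence needed before a pseudo-label is trusted

def weak_augment(x):    # flips/shifts in the paper; tiny noise as a stand-in here
    return x + 0.01 * torch.randn_like(x)

def strong_augment(x):  # RandAugment/CTAugment + Cutout in the paper; big noise stand-in
    return x + 0.30 * torch.randn_like(x)

def fixmatch_unlabeled_loss(unlabeled_batch):
    # 1. Predict on the weakly augmented image and take the argmax as a pseudo-label.
    with torch.no_grad():
        probs = F.softmax(model(weak_augment(unlabeled_batch)), dim=-1)
        confidence, pseudo_label = probs.max(dim=-1)
        mask = (confidence >= threshold).float()  # keep only confident pseudo-labels
    # 2. Train the strongly augmented image to match that pseudo-label.
    logits_strong = model(strong_augment(unlabeled_batch))
    per_example = F.cross_entropy(logits_strong, pseudo_label, reduction="none")
    return (mask * per_example).mean()

# Usage: total_loss = supervised_loss + lambda_u * fixmatch_unlabeled_loss(batch)
loss = fixmatch_unlabeled_loss(torch.randn(8, 3, 32, 32))
```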
A few weeks ago I wrote about Kuhn’s theory of paradigm shifts and how it relates to Bayesian inference. In this post I want to back up a little bit and explain what Bayesian inference is, and eventually rediscover the idea of a paradigm shift just from understanding how Bayesian inference works.
Facebook AI has developed the first neural network that uses symbolic reasoning to solve advanced mathematics problems.
Hot things glow red, hotter things yellow, and really hot things white. When you heat glass, it does not shine forth with an encouraging green or pale.
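That red-to-yellow-to-white progression is the blackbody spectrum shifting toward shorter wavelengths as temperature rises; as a reference point (not from the article), Wien's displacement law locates the spectral peak:

```latex
% Wien's displacement law: the blackbody spectrum of an object at temperature T peaks at
\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2.898 \times 10^{-3}\ \mathrm{m\,K}
% e.g. T \approx 1000\,\mathrm{K}: \lambda_{\max} \approx 2.9\,\mu\mathrm{m} (peak in the infrared;
%      only the red tail is visible), while T \approx 5800\,\mathrm{K}: \lambda_{\max} \approx 0.5\,\mu\mathrm{m}
%      (broad output across the visible spectrum, which looks white).
```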
A long article dealing with several aspects and impacts of climate change, with many references to papers and multimedia content that deepen the concepts.
Over the past year, I’ve worked on and off on documentation for WordPress. I started contributing during a freeze around launch to help developers transition to the new platform. I found that writing documentation is something I enjoy, and it is rewarding to help and educate people. Though it’s not a primary part of my job, I’ve continued to find time here and there to keep contributing.
In this time, I’ve read various resources on technical writing and documentation. These are my notes, both to help me remember later and to serve as a tool to help me think about writing now.
Detection and attribution typically aims to find long-term climate signals in internal, often short-term variability. Here, common methods are extended to high-frequency temperature and humidity data, detecting instantaneous, global-scale climate change since 1999 for any year and since 2012 for any day.
In 2012, Ryohei Hisano and Didier Sornette wrote a paper titled “On the distribution of time-to-proof of mathematical conjectures.”
Today Ken and I discuss predicting the end to mathematical conjectures. This is apart from considering odds on which way they will go, which we also talk about.
Nine years ago the Christmas issue of the New Scientist magazine analyzed a small set of solved mathematical conjectures and used it to forecast when the P vs. NP conjecture would be solved. The article estimated that the “probability for the P vs. NP problem to be solved by the year 2024 is roughly 50%”.
Federal public comment websites currently are unable to detect Deepfake Text once submitted. I created a computer program (a bot) that generated and submitted 1,001 deepfake comments regarding a Medicaid reform waiver to a federal public comment website, stopping submission when these comments comprised more than half of all submitted comments. I then formally withdrew the bot comments.
Mathematicians regard the Collatz conjecture as a quagmire and warn each other to stay away. But now Terence Tao has made more progress than anyone in decades.
Recently, I have been learning how Elliptic Curve Cryptography works. I searched around the internet and found many articles and videos explaining it. Most of them cover only a portion of it, and some skip many critical steps of how you get from here to there. In the end, I didn’t find an article that really explains it end-to-end in an intuitive way.
With that in mind, I would like to write a post explaining Elliptic Curve Cryptography, covering everything from the basics to key exchange, encryption, and decryption.
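To make the basics concrete, here is a deliberately toy (and insecure) version of the underlying point arithmetic plus a miniature Diffie–Hellman exchange; the curve parameters below are made up for readability and are not a real standardized curve:

```python
# Toy illustration of the elliptic-curve arithmetic behind ECDH. Tiny made-up
# parameters; real systems use standardized curves (e.g. Curve25519, secp256k1).
p, a, b = 97, 2, 3            # curve: y^2 = x^3 + a*x + b  (mod p)
G = (3, 6)                    # a generator point on the curve
O = None                      # the point at infinity (identity element)

def add(P, Q):
    """Add two points on the curve (the group operation)."""
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                           # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    """Scalar multiplication k*P by repeated doubling (the one-way operation)."""
    R = O
    while k:
        if k & 1:
            R = add(R, P)
        P, k = add(P, P), k >> 1
    return R

# ECDH in miniature: each side keeps a scalar secret, publishes scalar*G,
# and both arrive at the same shared point.
alice_secret, bob_secret = 2, 3
alice_public, bob_public = mul(alice_secret, G), mul(bob_secret, G)
assert mul(alice_secret, bob_public) == mul(bob_secret, alice_public)
print("shared point:", mul(alice_secret, bob_public))   # -> (3, 6)
```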
The Dataverse Project - Dataverse.org