Real life code talk between two working developers.
Join Sophia V. Prater on a dive into the weeds on UX systems, information architecture, human psychology, and simplifying the complex. Support this podcast: https://podcasters.spotify.com/pod/show/ooux/support
The world of computer programming is vast in scope. There are literally thousands of topics to cover, and no one person could ever reach them all. One of the goals of the Coding Blocks podcast is to introduce a number of these topics to the audience so they can learn during their commute or while cutting the grass. We will cover topics such as best programming practices, design patterns, coding for performance, object-oriented coding, database design and implementation, tips, tricks, and a who ...
The Smalltalk programming language is not only the first pure object-oriented language but also the birthplace of many of today's best practices in software development, including agile methods, design patterns, unit testing, refactoring, and incremental development. In the Smalltalk Reflections podcast, David Buck and Craig Latta guide you through the world of Smalltalk, covering topics from basic object-oriented principles to software testing, version control, and deployment techniques. ...
A podcast hosted by Kathryn Hodge and Robyn Silber that aims to introduce others to coding, demystify core programming concepts, provide tips for success, and start more conversations around important topics in technology.
Welcome to "The AI Chronicles", the podcast that takes you on a journey into the fascinating world of Artificial Intelligence (AI), AGI, GPT-5, GPT-4, Deep Learning, and Machine Learning. In this era of rapid technological advancement, AI has emerged as a transformative force, revolutionizing industries and shaping the way we interact with technology. I'm your host, GPT-5, and I invite you to join me as we delve into the cutting-edge developments, breakthroughs, and ethical implications of A ...
The moths of programming are ideas in the software engineering industry that have become pervasive due to virality rather than reality.
Not your average fitspo – talking all things fitness, self growth, and business with an authentic yet authoritative approach.
As it concerns the racial history of our country, are the objects in the mirror closer than they appear or not? Objects In The Mirror podcast asks this question as listeners hear firsthand accounts of those who lived during the segregation and early desegregation eras.
24 Minutes of UX is a podcast of the people, by the people, for the people. The format is a combination of an after-work discussion, a 1-on-1 chat, a global community platform, and a coaching session. Episodes typically feature a Seeker and a Giver. The Seeker is a person seeking advice on a specific topic within the domain of UX. The Giver is here for sharing experiences about “what I wish I had known when I started with this topic”. Both the Seeker and the Giver can be senior or junior - w ...
Scala: A Modern Language for Functional and Object-Oriented Programming
5:51
Scala, short for "scalable language," is a powerful programming language that merges the best features of both object-oriented and functional programming paradigms. Designed to be concise, elegant, and expressive, Scala offers a robust framework for developers to build scalable and maintainable software solutions. Combining Paradigms One of Scala's…
RL4J: Empowering Reinforcement Learning in Java
4:40
RL4J is a powerful open-source library designed for reinforcement learning (RL) applications within the Java ecosystem. Developed as part of the Deeplearning4j project, RL4J aims to provide developers and researchers with robust tools to implement and experiment with various reinforcement learning algorithms. As machine learning continues to expand…
Arbiter: Streamlining Optimization and Hyperparameter Tuning for Machine Learning Models
4:20
Arbiter is an advanced tool designed to enhance the process of optimization and hyperparameter tuning in machine learning models. As machine learning continues to evolve, the importance of fine-tuning model parameters to achieve optimal performance has become increasingly critical. Key Features of Arbiter Automated Hyperparameter Tuning: Arbiter au…
Hypothesis Testing: A Guide to Z-Test, T-Test, and ANOVA
49:12
Hypothesis testing is a fundamental method in statistics used to make inferences about a population based on sample data. It provides a structured approach to evaluate whether observed data deviates significantly from what is expected under a specific hypothesis. Three commonly used hypothesis tests are the Z-test, T-test, and ANOVA, each serving d…
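All three tests follow the same recipe: compute a standardized test statistic, then convert it to a p-value. A minimal Python sketch of the simplest case, a two-sided one-sample z-test (the numbers are illustrative, not from the episode):

```python
from math import sqrt
from statistics import NormalDist

def one_sample_z_test(sample_mean, pop_mean, pop_sd, n):
    """Two-sided one-sample z-test with a known population SD."""
    z = (sample_mean - pop_mean) / (pop_sd / sqrt(n))
    p = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p

# Sample of 36 with mean 103 vs. hypothesized mean 100, known SD 9:
# z = (103 - 100) / (9 / 6) = 2.0, so p is roughly 0.0455.
z, p = one_sample_z_test(103, 100, 9, 36)
```

The T-test swaps the known population SD for the sample SD (and the normal for a t distribution); ANOVA generalizes the comparison to three or more group means.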
Non-parametric Tests: Flexible Tools for Statistical Analysis
5:12
Non-parametric tests are a class of statistical methods that do not rely on assumptions about the underlying distribution of data. Unlike parametric tests, which assume a specific distribution for the data, non-parametric tests are more flexible and can be applied to a wider range of data types. This makes them particularly useful in situations whe…
Darkmode VSOP
By Daniel Coulbourne & Caleb Porzio
General Linear Model (GLM): A Versatile Framework for Data Analysis
3:55
The General Linear Model (GLM) is a foundational framework in statistical analysis, widely used for modeling and understanding relationships between variables. It offers a flexible and comprehensive approach for analyzing data by encompassing various types of linear relationships and can be applied across numerous fields including economics, social…
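The simplest instance of the GLM is ordinary least squares with a single predictor. A hand-rolled sketch in Python (the data values are made up for illustration):

```python
def ols_simple(xs, ys):
    """Ordinary least squares for y = b0 + b1*x, the simplest GLM case."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b1 = sxy / sxx          # slope
    b0 = my - b1 * mx       # intercept
    return b0, b1

# Perfectly linear data y = 1 + 2x, so OLS should recover those coefficients.
b0, b1 = ols_simple([1, 2, 3, 4], [3, 5, 7, 9])
```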
Statistical Models: Frameworks for Understanding and Predicting Data
4:42
Statistical models are powerful tools that allow us to understand, describe, and predict patterns in data. These models provide a structured way to capture the underlying relationships between variables, enabling us to make informed decisions, test hypotheses, and generate predictions about future outcomes. Whether in science, economics, medicine, …
Point and Interval Estimation: Tools for Accurate Statistical Inference
5:36
Point and interval estimation are key concepts in statistics that provide methods for estimating population parameters based on sample data. These techniques are fundamental to making informed decisions and predictions in various fields, from science and engineering to economics and public policy. By offering both specific values and ranges of plau…
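A sketch of the two ideas together: the sample mean is the point estimate, and a normal-approximation interval around it is the interval estimate (the function name and data are illustrative):

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def mean_confidence_interval(data, confidence=0.95):
    """Point estimate (sample mean) plus a normal-approximation
    confidence interval for the population mean."""
    n = len(data)
    m = mean(data)
    se = stdev(data) / sqrt(n)                      # standard error of the mean
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # e.g. about 1.96 for 95%
    return m, (m - z * se, m + z * se)

m, (lo, hi) = mean_confidence_interval([4.8, 5.1, 4.9, 5.2, 5.0, 5.1, 4.9, 5.0])
# Point estimate 5.0, with an interval of plausible values around it.
```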
P-values and Confidence Intervals: Essential Tools for Statistical Decision-Making
3:52
P-values and confidence intervals are fundamental concepts in statistical analysis, providing critical insights into the reliability and significance of data findings. These tools help researchers, scientists, and analysts make informed decisions based on sample data, enabling them to draw conclusions about broader populations with a known level of…
ImageNet: Revolutionizing Computer Vision and Deep Learning
37:22
ImageNet is a large-scale visual database designed for use in visual object recognition research, and it has played a pivotal role in advancing the field of computer vision and deep learning. Launched in 2009 by researchers at Princeton and Stanford, ImageNet consists of millions of labeled images categorized into thousands of object classes, makin…
Bayesian Inference and Posterior Distributions: A Dynamic Approach to Statistical Analysis
4:36
Bayesian inference is a powerful statistical method that provides a framework for updating our beliefs in light of new evidence. Rooted in Bayes' theorem, this approach allows us to combine prior knowledge with new data to form updated, or posterior, distributions, which offer a more nuanced and flexible understanding of the parameters we are study…
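The textbook conjugate example of this updating is the Beta-Binomial model: a Beta prior over a coin's bias, updated by observed flips into a Beta posterior. A minimal sketch (the numbers are illustrative):

```python
# Conjugate Beta-Binomial update: prior Beta(a, b), observe k successes
# in n trials, posterior is Beta(a + k, b + n - k).
def beta_binomial_update(a, b, k, n):
    return a + k, b + (n - k)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Uniform prior Beta(1, 1); observe 7 heads in 10 coin flips.
a_post, b_post = beta_binomial_update(1, 1, 7, 10)
posterior_mean = beta_mean(a_post, b_post)  # (1 + 7) / (1 + 7 + 1 + 3) = 8/12
```

The posterior mean sits between the prior mean (0.5) and the observed frequency (0.7), which is exactly the prior-plus-data compromise the episode describes.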
Statistical Inference: Drawing Conclusions from Data
5:45
Statistical inference is a critical branch of statistics that involves making predictions, estimates, or decisions about a population based on a sample of data. It serves as the bridge between raw data and meaningful insights, allowing researchers, analysts, and decision-makers to draw conclusions that extend beyond the immediate data at hand. Core…
Sampling Techniques: Ensuring Representativeness in Data Collection
3:33
Sampling techniques are crucial methods used in statistics to select a subset of individuals or observations from a larger population. These techniques allow researchers to gather data efficiently while ensuring that the sample accurately reflects the characteristics of the entire population. Among the most widely used sampling methods are random s…
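One of those methods, stratified sampling, can be sketched in a few lines: draw the same fraction from each stratum so the sample preserves the population's group proportions (the data and helper name are illustrative):

```python
import random

def stratified_sample(population, strata_key, frac, seed=0):
    """Draw the same fraction from each stratum so the sample keeps
    the population's group proportions."""
    rng = random.Random(seed)
    strata = {}
    for item in population:
        strata.setdefault(strata_key(item), []).append(item)
    sample = []
    for group in strata.values():
        k = max(1, round(len(group) * frac))
        sample.extend(rng.sample(group, k))
    return sample

# 80 'A' units and 20 'B' units; a 10% stratified sample keeps the 4:1 ratio.
pop = [("A", i) for i in range(80)] + [("B", i) for i in range(20)]
s = stratified_sample(pop, strata_key=lambda t: t[0], frac=0.10)
```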
Sampling Distributions: The Bridge Between Sample Data and Population Insights
5:42
Sampling distributions are a fundamental concept in statistics that play a crucial role in understanding how sample data relates to the broader population. When we collect data from a sample, we often want to make inferences about the entire population from which the sample was drawn. However, individual samples can vary, leading to differences bet…
Central Limit Theorem (CLT): The Pillar of Statistical Inference
5:50
The Central Limit Theorem (CLT) is one of the most important and foundational concepts in statistics. It provides a crucial link between probability theory and statistical inference, enabling statisticians and researchers to draw reliable conclusions about a population based on sample data. The CLT states that, under certain conditions, the distrib…
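A quick simulation illustrates the theorem: means of samples drawn from a decidedly non-normal Uniform(0, 1) distribution cluster around 0.5 with spread close to sigma/sqrt(n) (sample sizes and seed are illustrative):

```python
import random
from statistics import mean, stdev

# CLT sketch: means of n = 30 draws from Uniform(0, 1) should cluster
# around 0.5 with SD close to sigma / sqrt(n) = sqrt(1/12) / sqrt(30).
rng = random.Random(42)
sample_means = [mean(rng.random() for _ in range(30)) for _ in range(2000)]

grand_mean = mean(sample_means)
spread = stdev(sample_means)
expected_spread = (1 / 12) ** 0.5 / 30 ** 0.5  # roughly 0.053
```

Plotting `sample_means` as a histogram would show the familiar bell shape, even though each individual draw is uniform.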
Sampling and Distributions: The Cornerstones of Statistical Analysis
3:31
Sampling and distributions are fundamental concepts in statistics that play a crucial role in analyzing and understanding data. They form the backbone of statistical inference, enabling researchers to draw conclusions about a population based on a smaller, manageable subset of data. By understanding how samples relate to distributions, statistician…
Kernel Density Estimation (KDE): A Powerful Technique for Understanding Data Distributions
3:41
Kernel Density Estimation (KDE) is a non-parametric method used in statistics to estimate the probability density function of a random variable. Unlike traditional methods that rely on predefined distributions, KDE provides a flexible way to model the underlying distribution of data without making strong assumptions. This makes KDE a versatile and …
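At its core, KDE just averages a smooth bump (here Gaussian) centered on every data point. A minimal fixed-bandwidth sketch (the data and bandwidth are illustrative):

```python
from math import exp, pi, sqrt

def gaussian_kde(data, bandwidth):
    """Return a density estimate f(x): the average of Gaussian bumps
    centered on each data point (a minimal fixed-bandwidth KDE)."""
    n = len(data)
    norm = 1 / (n * bandwidth * sqrt(2 * pi))
    def f(x):
        return norm * sum(exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in data)
    return f

f = gaussian_kde([1.0, 1.2, 0.8, 1.1, 0.9], bandwidth=0.3)
# The density peaks near the data's center (~1.0) and vanishes far away.
```

Real implementations mainly add smarter bandwidth selection, since the bandwidth controls the bias-variance trade-off of the estimate.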
Distribution-Free Tests: Flexible Approaches to Hypothesis Testing Without Assumptions
3:02
Distribution-free tests, also known as non-parametric tests, are statistical methods used for hypothesis testing that do not rely on any assumptions about the underlying distribution of the data. Unlike parametric tests, which assume that data follows a specific distribution (such as the normal distribution), distribution-free tests offer a more fl…
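One classic distribution-free statistic is the Mann-Whitney U, which looks only at relative order: it counts how often a value from one sample exceeds a value from the other. A naive O(n*m) sketch (inputs are illustrative):

```python
def mann_whitney_u(xs, ys):
    """U statistic: number of (x, y) pairs with x > y, counting ties
    as 1/2 -- a distribution-free comparison of two samples."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1
            elif x == y:
                u += 0.5
    return u

# Every value in xs exceeds every value in ys, so U hits its maximum 3 * 3 = 9.
u = mann_whitney_u([7, 8, 9], [1, 2, 3])
```

Because U depends only on ranks, it is unchanged by any monotone transformation of the data, which is exactly why no distributional assumption is needed.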
Non-Parametric Statistics: Flexible Tools for Analyzing Data Without Assumptions
3:58
Non-parametric statistics is a branch of statistics that offers powerful tools for analyzing data without the need for making assumptions about the underlying distribution of the data. Unlike parametric methods, which require the data to follow a specific distribution (such as the normal distribution), non-parametric methods are more flexible and c…
Factor Analysis (FA): Unveiling Hidden Structures in Complex Data
4:00
Factor Analysis (FA) is a statistical method used to identify underlying relationships between observed variables. By reducing a large set of variables into a smaller number of factors, FA helps to simplify data, uncover hidden patterns, and reveal the underlying structure of complex datasets. This technique is widely employed in fields such as psy…
Kids have it the worst, dude.
By Daniel Coulbourne & Caleb Porzio
Probability Distributions: A Fundamental Tool for Understanding Uncertainty
3:45
Probability distributions are essential concepts in statistics and probability theory, providing a way to describe how probabilities are spread across different outcomes of a random event. They are the foundation for analyzing and interpreting data in various fields, enabling us to understand the likelihood of different outcomes, assess risks, and …
Probability Distributions: Mapping the Likelihood of Outcomes
4:30
Probability distributions are fundamental concepts in statistics and probability theory that describe how the probabilities of different possible outcomes are distributed across a range of values. By providing a mathematical description of the likelihood of various outcomes, probability distributions serve as the backbone for understanding and anal…
Things to Know when Considering Multi-Tenant or Multi-Threaded Applications
1:58:45
For the full show notes head over to: https://www.codingblocks.net/episode241
By Michael Outlaw, Allen Underwood, Joe Zack
Probability Spaces: The Foundation of Modern Probability Theory
3:36
Probability spaces form the fundamental framework within which probability theory operates. They provide a structured way to describe and analyze random events, offering a mathematical foundation for understanding uncertainty, risk, and randomness. By defining a space where all possible outcomes of an experiment or random process are considered, pr…
Multivariate Statistics: Analyzing Complex Data with Multiple Variables
4:35
Multivariate statistics is a branch of statistics that deals with the simultaneous observation and analysis of more than one statistical outcome variable. Unlike univariate or bivariate analysis, which focus on one or two variables at a time, multivariate statistics considers the interrelationships between multiple variables, providing a more compr…
Graph Recurrent Networks (GRNs): Bridging Temporal Dynamics and Graph Structures
4:43
Graph Recurrent Networks (GRNs) are an advanced type of neural network that combines the capabilities of recurrent neural networks (RNNs) with graph neural networks (GNNs) to model data that is both sequential and structured as graphs. GRNs are particularly powerful in scenarios where the data not only changes over time but is also interrelated in …
Ruby: A Dynamic, Elegant Programming Language for Web Development and Beyond
3:44
Ruby is a dynamic, open-source programming language known for its simplicity, elegance, and productivity. Created by Yukihiro "Matz" Matsumoto in the mid-1990s, Ruby was designed with the principle of making programming both enjoyable and efficient. The language’s intuitive syntax and flexibility make it a favorite among developers, especially for …
Vue.js: The Progressive JavaScript Framework for Modern Web Applications
6:59
Vue.js is an open-source JavaScript framework used for building user interfaces and single-page applications. Created by Evan You in 2014, Vue.js has quickly gained popularity among developers for its simplicity, flexibility, and powerful features. It is designed to be incrementally adoptable, meaning that it can be used for everything from enhanci…
ReactJS: A Powerful Library for Building Dynamic User Interfaces
5:09
ReactJS is a popular open-source JavaScript library used for building user interfaces, particularly single-page applications where a seamless user experience is key. Developed and maintained by Facebook, ReactJS has become a cornerstone of modern web development, enabling developers to create complex, interactive, and high-performance user interfac…
Apache Spark: The Unified Analytics Engine for Big Data Processing
29:04
Apache Spark is an open-source, distributed computing system designed for fast and flexible large-scale data processing. Originally developed at UC Berkeley’s AMPLab, Spark has become one of the most popular big data frameworks, known for its ability to process vast amounts of data quickly and efficiently. Spark provides a unified analytics engine …
Clojure: A Dynamic, Functional Programming Language for the JVM
3:43
Clojure is a modern, dynamic, and functional programming language that runs on the Java Virtual Machine (JVM). Created by Rich Hickey in 2007, Clojure is designed to be simple, expressive, and highly efficient for concurrent programming. It combines the powerful features of Lisp, a long-standing family of programming languages known for its flexibi…
Caffe: A Deep Learning Framework for Speed and Modularity
3:18
Caffe is an open-source deep learning framework developed by the Berkeley Vision and Learning Center (BVLC) and contributed to by a global community of researchers and engineers. Designed with an emphasis on speed, modularity, and ease of use, Caffe is particularly well-suited for developing and deploying deep learning models, especially in the fie…
Nimfa: A Python Library for Non-negative Matrix Factorization
5:43
Nimfa is a Python library specifically designed for performing Non-negative Matrix Factorization (NMF), a powerful technique used in data analysis to uncover hidden structures and patterns in non-negative data. Developed to be both flexible and easy to use, Nimfa provides a comprehensive set of tools for implementing various NMF algorithms, making …
FastAPI: High-Performance Web Framework for Modern APIs
6:30
FastAPI is a modern, open-source web framework for building APIs with Python. Created by Sebastián Ramírez, FastAPI is designed to provide high performance, easy-to-use features, and robust documentation. It leverages Python's type hints to offer automatic data validation and serialization, making it an excellent choice for developing RESTful APIs …
NetBeans: A Comprehensive Integrated Development Environment
4:23
NetBeans is a powerful, open-source integrated development environment (IDE) used by developers to create applications in various programming languages. Initially developed by Sun Microsystems and now maintained by the Apache Software Foundation, NetBeans provides a robust platform for building desktop, web, and mobile applications. It supports a w…
Area Under the Curve (AUC): A Comprehensive Metric for Evaluating Classifier Performance
5:24
The Area Under the Curve (AUC) is a widely used metric in the evaluation of binary classification models. It provides a single scalar value that summarizes the performance of a classifier across all possible threshold values, offering a clear and intuitive measure of how well the model distinguishes between positive and negative classes. The AUC is…
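That threshold-free reading of AUC can be computed directly: it equals the probability that a randomly chosen positive outscores a randomly chosen negative. A naive sketch (scores and labels are illustrative):

```python
def auc(scores, labels):
    """AUC as the probability that a randomly chosen positive is
    scored above a randomly chosen negative (ties count 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Three of the four positive-negative pairs are ordered correctly: AUC = 0.75.
a = auc([0.9, 0.4, 0.3, 0.2], [1, 0, 1, 0])
```

An AUC of 0.5 corresponds to random ranking and 1.0 to perfect separation, regardless of which classification threshold is eventually chosen.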
Non-Negative Matrix Factorization (NMF): Uncovering Hidden Patterns in Data
7:43
Non-Negative Matrix Factorization (NMF) is a powerful technique in the field of data analysis and machine learning used to reduce the dimensionality of data and uncover hidden patterns. Unlike other matrix factorization methods, NMF imposes the constraint that the matrix elements must be non-negative. This constraint makes NMF particularly useful f…
Grab your headphones because it's water cooler time! In this episode we're catching up on feedback, putting our skills to the test, and wondering what we're missing. Plus, Allen's telling it how it is, Outlaw is putting it all together and Joe is minding the gaps! View the full show notes here: https://www.codingblocks.net/episode240 Reviews Thank …
Signal Detection Theory (SDT): Understanding Decision-Making in the Presence of Uncertainty
4:51
Signal Detection Theory (SDT) is a framework used to analyze and understand decision-making processes in situations where there is uncertainty. Originating in the fields of radar and telecommunications during World War II, SDT has since been applied across various domains, including psychology, neuroscience, medical diagnostics, and market research…
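SDT's core sensitivity measure, d-prime, is the separation between the signal and noise distributions in SD units, computed from hit and false-alarm rates. A minimal sketch (the rates are illustrative):

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate):
    how far apart the signal and noise distributions sit, in SD units."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Hits on 84% of signal trials, false alarms on 16% of noise trials:
# z(0.84) is about 0.99 and z(0.16) about -0.99, so d' is about 2.
dp = d_prime(0.84, 0.16)
```

Separating sensitivity (d') from response bias is the key move of SDT: an observer can become more or less cautious without changing how well they actually detect the signal.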
AngularJS: Revolutionizing Web Development with Dynamic Applications
5:57
AngularJS is a powerful JavaScript-based open-source front-end web framework developed by Google. Introduced in 2010, AngularJS was designed to simplify the development and testing of single-page applications (SPAs) by providing a robust framework for client-side model–view–controller (MVC) architecture. It has significantly transformed the way dev…
Expectation-Maximization Algorithm (EM): A Powerful Tool for Data Analysis
4:24
The Expectation-Maximization (EM) algorithm is a widely-used statistical technique for finding maximum likelihood estimates in the presence of latent variables. Developed by Arthur Dempster, Nan Laird, and Donald Rubin in 1977, the EM algorithm provides an iterative method to handle incomplete data or missing values, making it a cornerstone in fiel…
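A classic use of EM is fitting a two-component Gaussian mixture in one dimension: the E-step soft-assigns each point to a component, and the M-step re-fits each component from those weights. A self-contained sketch (data, seed, and initialization are illustrative):

```python
import random
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def em_two_gaussians(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture: the E-step computes
    each point's responsibility, the M-step re-fits means, SDs, weight."""
    mu1, mu2 = min(data), max(data)               # crude initialization
    s1 = s2 = (max(data) - min(data)) / 4 or 1.0
    w = 0.5                                       # mixing weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point.
        r = [w * normal_pdf(x, mu1, s1) /
             (w * normal_pdf(x, mu1, s1) + (1 - w) * normal_pdf(x, mu2, s2))
             for x in data]
        # M-step: responsibility-weighted means, SDs, and mixing weight.
        n1 = sum(r)
        n2 = len(data) - n1
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        s1 = sqrt(sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1) or 1e-6
        s2 = sqrt(sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2) or 1e-6
        w = n1 / len(data)
    return mu1, mu2

# Two well-separated clusters around 0 and 10: EM should recover both means.
rng = random.Random(1)
data = [rng.gauss(0, 1) for _ in range(200)] + [rng.gauss(10, 1) for _ in range(200)]
mu1, mu2 = em_two_gaussians(data)
```

The latent variable here is each point's cluster membership: EM never observes it, but iterating the two steps monotonically increases the data's likelihood.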
Kotlin: A Modern Programming Language for the JVM and Beyond
7:05
Kotlin is a contemporary programming language developed by JetBrains, designed to be fully interoperable with Java while offering a more concise and expressive syntax. Introduced in 2011 and officially released in 2016, Kotlin has rapidly gained popularity among developers for its modern features, safety enhancements, and seamless integration with …
Dynamic Topic Models (DTM): Capturing the Evolution of Themes Over Time
5:00
Dynamic Topic Models (DTM) are an advanced extension of topic modeling techniques designed to analyze and understand how topics in a collection of documents evolve over time. Developed to address the limitations of static topic models like Latent Dirichlet Allocation (LDA), DTMs allow researchers and analysts to track the progression and transforma…
False Positive Rate (FPR): A Critical Metric for Evaluating Classification Accuracy
6:14
The False Positive Rate (FPR) is a crucial metric used to evaluate the performance of binary classification models. It measures the proportion of negative instances that are incorrectly classified as positive by the model. Understanding FPR is essential for assessing how well a model distinguishes between classes, particularly in applications where…
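Computed from a confusion matrix, FPR is simply FP / (FP + TN). A minimal sketch (the labels are illustrative):

```python
def false_positive_rate(y_true, y_pred):
    """FPR = FP / (FP + TN): the share of actual negatives
    that the classifier wrongly flags as positive."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return fp / (fp + tn)

# Four actual negatives, one of them predicted positive: FPR = 0.25.
fpr = false_positive_rate([0, 0, 0, 0, 1, 1], [1, 0, 0, 0, 1, 0])
```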
Things are in full swing. There are many things we cannot say. We are both slammed trying to get everything done for Laracon, which is why this one wasn't put out very quickly.
By Daniel Coulbourne & Caleb Porzio