Jan. 10th, 2019

The Borscht Belt. Music is playing, they are singing "Tumbalalaika". Subtitles: "a German song is being performed". (Well, sure, "Horst Wessel", for example, or "O Tannenbaum".)
John Baez writes about this in his tweets (go look them up, or, better, subscribe to his amazing feed).

Do you know what the "continuum hypothesis" is? It is the question of whether there is a set of intermediate size between a countable set (ℵ0), for example the natural numbers, and its power set (2^ℵ0, the continuum). It was proven over 50 years ago that neither its truth nor its falsity follows from the Zermelo-Fraenkel axioms. So when mathematicians say that they base their absolutely rigorous and correct theorems on set theory (I don't believe them), we can always ask: which one?
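To spell it out in symbols (my own formulation of the usual statement):

\[
\text{CH}:\qquad \neg\,\exists S\ \ \aleph_0 < |S| < 2^{\aleph_0}
\]

Gödel (1940) showed that CH is consistent with ZFC, and Cohen (1963) showed that its negation is too; that is exactly what "neither follows from the axioms" means.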

Now things have gotten more serious.

Suppose you are a serious "machine learning data scientist", and you want to base your tea-leaf guesses on solid math. That is, figure out the theory behind taking billions of pictures of cats and dogs and detecting the cats in them (one of my former colleagues was trying to figure out whether an image showed a cat or a mouse, and discovered that if the fur is uniformly gray, the "algorithm" says it's a mouse. Do you have a Russian Blue?)

So what we do, while "detecting", is a kind of data compression. It is something like a mapping 2^N -> N.
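To make the compression idea concrete, here is a minimal toy sketch (mine, not the paper's; the paper's setting, "estimating the maximum", is different): a 1-D threshold learner whose whole labeled sample compresses to at most two boundary examples, from which an equivalent hypothesis is reconstructed.

# Toy "learning as compression" sketch (illustrative only; assumes the sample
# really is separable by some threshold h(x) = [x >= t]).

def compress(sample):
    """Keep the rightmost negative and the leftmost positive example, if any."""
    negatives = [x for x, y in sample if y == 0]
    positives = [x for x, y in sample if y == 1]
    kept = []
    if negatives:
        kept.append((max(negatives), 0))
    if positives:
        kept.append((min(positives), 1))
    return kept  # at most 2 points, no matter how large the sample is

def reconstruct(kept):
    """Rebuild a threshold hypothesis from the kept points."""
    positives = [x for x, y in kept if y == 1]
    t = min(positives) if positives else float("inf")
    return lambda x: 1 if x >= t else 0

sample = [(0.10, 0), (0.30, 0), (0.55, 1), (0.70, 1), (0.90, 1)]
h = reconstruct(compress(sample))
assert all(h(x) == y for x, y in sample)   # the 2-point "sketch" explains the whole sample
print(compress(sample))                    # [(0.3, 0), (0.55, 1)]

The paper's point is that this kind of equivalence between being learnable and being compressible holds much more generally, and that is exactly where set theory sneaks in.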

Now, surprise. The feasibility of this operation, in a general setting, turns out to be equivalent to there being only finitely many intermediate sizes between ℵ0 and the continuum 2^ℵ0.
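If I paraphrase their main theorem (my reading, not a verbatim quote): for the "estimating the maximum" (EMX) problem over the unit interval,

\[
\text{EMX-learnable} \iff 2^{\aleph_0} < \aleph_\omega \iff \exists\, k \in \mathbb{N}\ \ 2^{\aleph_0} \le \aleph_k,
\]

i.e., the continuum sits only finitely many cardinals above ℵ0. ZFC can neither prove nor refute that, so it can neither prove nor refute learnability in this setting.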

Details are here: https://www.nature.com/articles/s42256-018-0002-3

Learnability can be undecidable

"The mathematical foundations of machine learning play a key role in the development of the field. They improve our understanding and provide tools for designing new learning paradigms. The advantages of mathematics, however, sometimes come with a cost. Gödel and Cohen showed, in a nutshell, that not everything is provable. Here we show that machine learning shares this fate. We describe simple scenarios where learnability cannot be proved nor refuted using the standard axioms of mathematics. Our proof is based on the fact the continuum hypothesis cannot be proved nor refuted. We show that, in some cases, a solution to the ‘estimating the maximum’ problem is equivalent to the continuum hypothesis. The main idea is to prove an equivalence between learnability and compression."
