

Depends on the ‘average’. Mean? Yes; median? Probably not.
A single portal into all the different rooms might work if you are OK with your continuous room-choosing mechanism having probability zero of getting you into your chosen room. Not a problem as long as you take everything with you, since the probability of hitting some empty room is of course still one.
Yes, since I have hands to feel the door. Hands which, incidentally, are able to help me count things they touch :)
How would anyone get into one of those free, uncountable rooms if they can’t see them?
If you can see doors to enter each room, then they are countable.
Aaron was about as much a cofounder of Reddit as Musk is a cofounder of Tesla. Source: I was on Reddit when he joined, and I remember his own YC startup, Infogami.


Actually, I agree. I guess I was just still annoyed after having just read about how LLMs are somehow not neural networks, and in fact not machine learning at all…
Btw, you can absolutely fine-tune LLMs on classical regression problems if you have the required data (and care more about prediction quality than statistical guarantees). The resulting regressors are often quite good.


I will admit I didn’t check, because it was late and the article failed to load. I just remember reading several papers 1–2 years ago on things like cancer-cell segmentation, where the ‘classical’ U-Net architecture was beaten either by pure transformers or by U-Nets with attention gates added on all horizontal connections.


Those models will almost certainly be essentially the same transformer architecture as any of the LLMs use, simply because transformers beat most other architectures in almost any field people have tried them in. An LLM is, after all, just a classifier with an unusually large set of classes (all possible tokens) which gets applied repeatedly.
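To make the “classifier applied repeatedly” view concrete, here is a toy sketch in Python. The vocabulary and the score table are made up purely for illustration; in a real LLM the scores would come from a transformer conditioned on the whole context, not a hard-coded bigram table.

```python
# Toy illustration: generation = repeated classification over the vocabulary.
VOCAB = ["<eos>", "the", "cat", "sat"]

def toy_classifier(context):
    """Stand-in for a transformer: one score per vocabulary token.
    Hard-coded bigram table, purely illustrative."""
    table = {"<start>": [0, 3, 1, 0],
             "the":     [0, 0, 3, 1],
             "cat":     [0, 0, 0, 3],
             "sat":     [3, 1, 0, 0]}
    return table[context[-1]]

def generate(max_steps=10):
    context = ["<start>"]
    for _ in range(max_steps):
        scores = toy_classifier(context)              # one classification step
        next_token = VOCAB[scores.index(max(scores))] # greedy argmax decoding
        if next_token == "<eos>":
            break
        context.append(next_token)                    # feed the output back in
    return context[1:]

print(generate())  # -> ['the', 'cat', 'sat']
```

The loop is the whole point: each iteration is an ordinary classification, and only the feedback of the chosen class into the input makes it “generative”.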


Because, while Switzerland is not part of the EU, it follows many of its regulations. Maybe even most of them.
In this particular case, I happen to know that the unofficial rule is indeed to use burner phones for travel into the US in some cases. But you’re never supposed to have unencrypted data on your phone or laptop in any case.
    > binom.test(11, n=24, alternative = "two.sided")

            Exact binomial test

    data:  11 and 24
    number of successes = 11, number of trials = 24, p-value = 0.8388
    alternative hypothesis: true probability of success is not equal to 0.5
    95 percent confidence interval:
     0.2555302 0.6717919
    sample estimates:
    probability of success
                 0.4583333
Probably not. Or at least we can’t conclude that from the data. ¯\_(ツ)_/¯
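For anyone without R at hand, the same exact test can be sketched in Python with SciPy (assuming SciPy ≥ 1.7, where `binomtest` was introduced; `proportion_ci` defaults to the Clopper–Pearson interval, which is also what R’s `binom.test` reports):

```python
from scipy.stats import binomtest

# Exact two-sided binomial test: 11 successes out of 24 trials, H0: p = 0.5
result = binomtest(11, n=24, p=0.5, alternative="two-sided")
print(round(result.pvalue, 4))  # ~0.8388, matching the R output above

# 95% Clopper-Pearson confidence interval for the success probability
ci = result.proportion_ci(confidence_level=0.95)
print(round(ci.low, 7), round(ci.high, 7))  # ~0.2555302 ~0.6717919
```

With a p-value this far from any conventional threshold, both tools agree there is no evidence against p = 0.5 here.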
I didn’t know that Rómendacil II was born with another name. Got the rest. :)
Logicians are mathematicians. Well, most of them are.
I have yet to meet a single logician, American or otherwise, who would use the definition without 0.
That said, it seems to depend on the field. I think I’ve had this discussion with a friend working in analysis.
But the vector space of (all) real functions is a completely different beast from the space of computable functions on finite-precision numbers. If you restrict equality of those functions to their extension, defined as

    f = g iff ∀x ∈ ℝ: f(x) = g(x),

then that vector space appears to be not only finite-dimensional but in fact finite. Otherwise you probably get a countably-infinite-dimensional vector space indexed by lambda terms (or whatever formalism you prefer). But nothing like the space which contains vectors like

    F_{x_0}(x) := 1 if x = x_0; 0 otherwise,

where x_0 is uncomputable.
Functions from the reals to the reals are an example of a vector space with elements which can not be represented as a list of numbers.
It may have nothing to do with categorization, but it has everything to do with categorification, which is much more interesting anyway.
Luckily, we were able to push our annual alp weekend from last week to this one. :)
The last three days were actually very good, and during the few hours of thunderstorms we at least had a very beautiful view.
By steamboat across the lake: “Look, Tini, did you see that castle there?” my mother asks; I was three years old. “Look, Mother, the castle is still here!”
Oberhofen on the Thunersee! I think back; it feels good, it hurts. “Oberhofen!” we dock, and in Mother’s bag there’s chocolate!
With the possible exception of some small stations, everyone presenting weather predictions around here is a meteorologist. ¯\_(ツ)_/¯