It is. Look at the uncertainty principle.
So, entropy is a multi-faceted concept mainly because it applies both to systems where we can see all the moving parts and to bulk matter, where there are of the order of 10^23 parts and vastly more states.
The information-theoretic or Shannon entropy is defined in terms of the uncertainty of a message transmitted through a channel, and it can be shown to be equivalent to thermodynamic entropy (up to a factor of Boltzmann's constant). In that technical sense, information can be defined as the negative of entropy, sometimes called negentropy. That is typically what is meant by "everything is information" - although information is also used in other senses even by those same physicists (for example, constraints or symmetries), which causes confusion. The earliest era we can see after the big bang had the greatest degree of certainty about the state of most matter; that is an equivalent description to saying it was in a low entropy state. It must also be noted that entropy is always relative: it is only meaningful when comparing states.
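To make the technical sense concrete, here is a minimal sketch of Shannon entropy as a measure of uncertainty, and the conventional bridge to thermodynamic entropy via Boltzmann's constant. The function name and the example distributions are my own illustration, not anything from a specific source.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Maximal when all outcomes are equally likely (most uncertainty),
    zero when one outcome is certain (complete information).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit of uncertainty
print(shannon_entropy([0.9, 0.1]))  # biased coin: less than 1 bit
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits

# The bridge to thermodynamic entropy (Gibbs form): S = k_B * ln(2) * H,
# converting bits to joules per kelvin.
k_B = 1.380649e-23  # Boltzmann's constant, J/K
S = k_B * math.log(2) * shannon_entropy([0.5, 0.5])
print(S)
```

The point of the sketch is only that "more certainty about the state" means lower entropy, which is why the early universe, whose state we can pin down most precisely, counts as low entropy in this language.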
Entropy was long treated as a secondary property of matter, or mass-energy. But with relativity limiting information propagation to the speed of light, and quantum mechanics having the uncertainty principle define quantum behaviour, information transfer and uncertainty are clearly fundamental to the behaviour of physical systems. It can be argued that modern physics is property-dualist between mass-energy and information. The expected resolution of those into a single description - mass-energy plus initial conditions giving a complete account, as in classical physics - has now shifted to an expectation that mass-energy will be subsumed into a more fundamental information-space description. Mass is now described as interaction with the Higgs field; that just leaves space-time, and the effect of mass on it, to unify all the known fields.
The programme to quantise the classical theory of gravity has so far failed, and space-time appears in quantum field theory only as an unexamined background. So interest is now shifting to how space-time might be emergent from a quantum description.
Chiribella has proposed the 'purification principle', on which the increase of entropy is equivalent to the spreading out of information, i.e. the mixing of states that need to be known for a complete description. The conservation of information is increasingly speculated by Susskind and others to be a universal principle, because that would resolve the black hole information paradox. Black holes seem to have the highest density of states possible, another way of saying maximum entropy, behaving as a kind of frictionless superfluid inside the event horizon. Time there behaves so strangely that observations, such as those from gravitational-wave observatories, will likely be needed to move forward.
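The idea that mixing states raises entropy can be illustrated with the von Neumann entropy of a qubit - this is a standard textbook calculation, not Chiribella's formalism, and the helper function here is my own sketch. A pure state (complete knowledge) has zero entropy; the maximally mixed state carries one full bit of missing information.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits.

    Computed from the eigenvalues of the density matrix; zero
    eigenvalues contribute nothing (lim p->0 of p log p is 0).
    """
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

# Pure state |0><0|: the state is fully known, entropy is zero.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])

# Maximally mixed state I/2: an equal-weight mixture of |0> and |1>,
# the least-informative description of a qubit.
mixed = np.eye(2) / 2

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```

Mixing the two basis states has spread the information out: to complete the description you now need one more bit than for the pure state, which is the sense of "entropy increase as mixing" used above.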
There are many professional physicists who give the information-as-fundamental view short shrift, e.g. Sean Carroll. It is intuitively appealing, but it can certainly be argued that it merely restates what was always known: to do physics we want to compare the states of systems at different times, which means gathering and contrasting information.