> Is there a reliable resource/website that calculates which key sizes are currently at risk, based on how fast the newest quantum computers are?
As other answers have conveyed, if an algorithm is susceptible to attack by quantum computers, moving to a larger key length is not really a fix. For the public-key schemes broken by Shor's algorithm (RSA and discrete-log/elliptic-curve systems), the quantum attack runs in polynomial time in the key size, so a larger key only adds polynomial cost for the attacker; it wouldn't take much technological advancement to bring that larger key length under threat (and you never really know what the current state of the art is). We've seen from the history of classical computers (e.g. Moore's Law) that once a technology passes some basic threshold, exponential improvement is possible.
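To make that concrete, here is a small sketch (not from the original answer; the modulus and names are illustrative) of the classical skeleton of Shor's algorithm: the order-finding step, which is the only part a quantum computer speeds up, followed by Shor's classical post-processing that turns the order into a factor. Classically the order-finding loop takes exponential time in the bit-length of $n$; quantumly it is polynomial, which is why no constant bump in key size helps.

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n: smallest r > 0 with a^r = 1 (mod n).
    Classically this brute-force loop is exponential in the size of n;
    this is exactly the step a quantum computer does in polynomial time."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Shor's classical post-processing: from the order r of a mod n,
    recover a nontrivial factor of n via gcd(a^(r/2) +/- 1, n)."""
    r = order(a, n)
    if r % 2:
        return None  # odd order: retry with a different base a
    cand = gcd(pow(a, r // 2, n) - 1, n)
    if 1 < cand < n:
        return cand
    return gcd(pow(a, r // 2, n) + 1, n)

# Toy "RSA modulus" n = 3233 = 61 * 53, with base a = 7 coprime to n.
print(shor_classical(3233, 7))  # → 53
```

The point is that the hard part (order-finding) is the whole of the classical difficulty; once that falls to a polynomial-time quantum subroutine, the rest is cheap arithmetic at any key size.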
What other answers haven't mentioned is timeliness. Yes, you could ask "based on our current state of technology, is a particular algorithm and key length combination secure?", but that only tells you about security at this instant. Sometimes that's good enough: if you want to agree a clandestine meeting with someone tomorrow, and it doesn't matter whether anybody finds out about it after the fact, you can use any algorithm that gives a yes answer to the question.

However, what if the information has to remain secret for longer? Perhaps you're emailing someone the identity of an undercover agent they are to meet. It's not enough that the identity is protected now; it must also stay protected into the future. For data like that, you essentially have to assume that anything encrypted with an algorithm potentially susceptible to quantum attack will be read at some point, and is therefore already compromised (this is the "harvest now, decrypt later" threat: an adversary can record your ciphertext today and wait for the hardware to catch up). Actually, if you're super-paranoid, you should assume this about all crypto algorithms anyway, because even if the theory says they're perfectly secure, their practical implementation may be faulty and susceptible to cracking.
> Or possibly, will new algorithms be created which try to prevent quantum computers from being able to crack them easily?
To replace these potentially breakable systems you need new methods, which generally come under the banner of post-quantum cryptography. Some of these exist already, but there are varying levels of confidence about how well they will actually hold up to attack. The argument for their security is much like the one for factoring on a classical computer: "lots of people have tried, nobody has succeeded in doing it efficiently, so we guess it isn't possible." For post-quantum schemes the same argument applies, but fewer people have tried, and for less time, so there isn't yet the same weight of confidence. The aim is to back this up with a bit more rigour from computer science, making connections to complexity classes, and particularly to the assumption P$\neq$NP.
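One of the simplest post-quantum constructions that "exists already" is the hash-based Lamport one-time signature, whose security rests only on the preimage resistance of a hash function — a property Grover's algorithm weakens but does not break. A minimal sketch (illustrative, not production code; a key pair must never sign more than one message):

```python
import hashlib
import secrets

def H(b):
    """SHA-256, the only cryptographic assumption in the scheme."""
    return hashlib.sha256(b).digest()

def keygen():
    # Secret key: 256 pairs of random 32-byte values.
    # Public key: the hash of each value.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg):
    """The 256 bits of the message digest, most significant bit first."""
    d = H(msg)
    return [(d[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # For each digest bit, reveal one of the two secrets. ONE-TIME use only:
    # signing a second message leaks enough secrets to allow forgeries.
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(pk, msg, sig):
    # Hash each revealed secret and check it against the published hash.
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"meet at dawn")
print(verify(pk, b"meet at dawn", sig))   # → True
print(verify(pk, b"meet at noon", sig))   # → False (digests differ)
```

Modern standardized hash-based schemes (e.g. SPHINCS+) build on exactly this idea while removing the one-time restriction; they're among the post-quantum candidates people have the most confidence in, precisely because the underlying assumption is so minimal.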