To ask what makes a system intelligent almost begs the question 'in this context, what do we mean by artificially intelligent?', which I think is what this question is really getting at.
From my studies, I've come to see that 'Artificial Intelligence' is a catchy but perhaps misleading term: it conjures up images of self-driving cars and robots that will take over the earth.
What I've found AI, and 'intelligent' systems more so, really represent is an aid or a support that works for us, rather than one that works because of us... hear me out:
What makes the jump to an intelligent system, for me, is the step where the system begins to adapt or learn, or otherwise do things I didn't directly tell it to do. With the sundial, I measured and cut every inch of it by hand, and positioned it in a specific way to do a specific thing.
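That distinction can be sketched in a toy example (all class and parameter names here are my own, purely illustrative): a fixed system applies exactly the rule its builder placed in it, while an adaptive one shifts its behaviour from observations its builder never spelled out.

```python
class FixedSundial:
    """Behaves exactly as built: the rule is hand-placed and never changes."""
    def hour(self, shadow_angle):
        # 15 degrees of shadow per hour, hard-coded by the builder
        return shadow_angle // 15

class AdaptiveThermostat:
    """Nudges its setpoint toward user corrections it was never
    explicitly programmed to anticipate."""
    def __init__(self, setpoint=20.0):
        self.setpoint = setpoint

    def correct(self, user_choice):
        # Move the setpoint halfway toward each observed override
        self.setpoint += 0.5 * (user_choice - self.setpoint)

dial = FixedSundial()
print(dial.hour(45))        # always 3, no matter how often it runs

thermo = AdaptiveThermostat()
for choice in (22, 22, 22):
    thermo.correct(choice)
print(thermo.setpoint)      # has drifted toward 22 without a new rule
```

The sundial's output is fully determined by its construction; the thermostat's behaviour after a few corrections is something its author set in motion but did not directly dictate, which is the 'jump' I mean.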
When a programmer gets into a car he automated, it may do some things he didn't directly program, or maybe couldn't even expect (just one example: querying some database, noticing that lots of people are driving somewhere, discovering a concert is going on there, and asking whether the driver wants directions or tickets).
In conclusion, an intelligent system to me is one that we build in such a way that it educates and supports us, rather than a system we ourselves 'educate' to do a specific task: a supportive system that elucidates and adapts and acts 'rationally' even when we didn't tell it what 'rational' behaviour was.