The difference between the programming of 20 years ago and that of today can be explained by a simple example.
Imagine two tailors who must make the same dress.
The first tailor does not want to waste fabric, so he chalks out the pieces to be cut so that they fit together closely. There will be less waste and less cloth consumed, but the layout takes much more time to work out. There must also be no mistakes in cutting, because there is not enough cloth to cut the same piece twice; so the cuts are made slowly and carefully.
The second tailor wants to go faster, so he buys extra fabric in case a few pieces are ruined by hurried cutting, and each length of cloth yields fewer pieces because he lays them out quickly. This certainly means a greater expense in raw material, but it also means shorter production times: with more cloth on hand, any cutting mistakes made in a hurry can simply be corrected with a fresh piece.
20 years ago machines were less powerful: slower processors, less RAM, and smaller hard drives. Developing an application took more time because the algorithm was studied carefully to reduce its size and increase its speed.
Today, if an application is too heavy, the suggestion is to upgrade the hardware ... and every software or operating-system update demands ever more hardware resources.