Small executables load into memory very fast and are right there when you open them. This helps for utilities such as GUI tools that you need to pop open quickly. It also helps with CGI programs that need to be forked and shut down immediately. Or take a GUI tool that searches files - I want that GUI to open quickly because I want to search files right away - I don't want to wait all day for the program to open before I can search. Files need to be searched spur of the moment. Or say I have a quick job, such as unzipping 40 files, but the command line isn't good enough since those 40 files are in different directories and I don't know exactly which ones - I want to fire open a fast GUI program (one with a small executable, so it loads onto my screen, at my fingertips, right away) which lets me enter a few patterns and open up those 40 files based on which directories they are located in.
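To sketch what I mean: even the "unzip 40 scattered files" task boils down to a pattern search across directories. A minimal command-line sketch, where the `report-*.zip` pattern is made up purely for illustration:

```shell
# List every archive matching a hypothetical pattern, anywhere under the
# current directory - the same pattern a fast GUI would let you type in.
# Piping the list into unzip(1) would finish the job, e.g.:
#   find . -name 'report-*.zip' -exec unzip -o {} -d extracted \;
find . -name 'report-*.zip'
```

The point stands either way: whether the front end is a shell or a small GUI, the pattern has to be entered and acted on spur of the moment, so the tool has to be there when you need it.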
So when I want to fire open a program and do these quick tasks, it's got to open fast or I lose patience and lose interest. Since the program has to open fast, we can see that speed is related to size: a smaller program loads faster. It's not just the speed of the algorithms stored in the program that matters. A very bloated program could take a minute to load, and maybe once loaded its algorithms are fast - but that's not the full story of speed, I'm afraid. Some of the most important daily tasks on computers are accomplished spur of the moment, and having a program that can open up immediately is important for a hacker who needs to get things done spur of the moment. Some people use the command line only, and are command line zealots, because the command line handles spur of the moment tasks. But the command line isn't always the best option, since it doesn't offer the clear visual feedback, structure, and organization that a GUI can offer if designed properly (I'm not talking about Windows Explorer junk programs or kludgy wizard dialog interfaces).
So what am I getting at? The size of an executable matters, because a small executable loads fast and gets things done while I'm in a spur of the moment thought. If I have to wait 10-15 seconds, or sometimes a minute, for my StarOffice program or my 6MB Lazarus-produced executable to open, I'm probably going to choose some other tool like a fast text editor, depending on the situation. For example, many people will fire open a small text editor to jot down a quick note instead of opening up Microsoft Word, StarOffice, or OpenOffice. Many people will open a quick and fast file manager the moment they think of something they need to delete or copy. Hackers' thoughts happen spur of the moment - and real hackers like fast-loading programs that satisfy those spur of the moment needs. I'm not talking about the typical dipshit computer user here - for them, size doesn't matter, because their thoughts aren't spur of the moment and they aren't getting things done at a hacker's rapid pace.
But then there are the nonsensical folks out there who say that size doesn't matter - such as those who continually misunderstand why the size of Lazarus exes is important. It is not always important - if I open up a big program and I'm going to be working in it for hours, then it's okay if the exe is 6MB or sometimes even 20MB, depending on what jobs that big program can do. So size isn't always important - no one claimed it was. But there are many, many situations where I want to pop open 5 different GUI programs and kill them quickly, along with popping open 5 command lines and killing them quickly. At hacker pace, not at 5-15 second pace. And I'm not talking just about tiny GUI programs that do one thing - I'm talking about opening a GUI program that can do 200 things but is still fairly small and still fast-loading.
What really bothers me is that these same folks who claim size doesn't matter are the same folks using compilers - and the whole purpose of a compiler is to MAKE SOMETHING SMALLER, IN BINARY COMPACT FORM. Otherwise, if size doesn't matter, you might as well ship a 1-5MB interpreter gzipped alongside your application code, because that bundle is going to end up smaller than the friggin bloated 4MB Lazarus exe!
So if you think size doesn't matter, then speed doesn't matter either, and there is really no purpose in using a compiler at all. You might as well use compiled byte code or ship the interpreter with each application. Then at least you reap all the benefits and rewards of interpreters too, instead of shipping a useless huge binary that only does 50 things and is bigger than some entire interpreters which can do 50,000 things!
These same people who claim size doesn't matter are continually worrying about speed and efficiency - a total paradox! People will bash others for worrying about size, but a few days later you see them advising someone to stop returning records from functions, because a record result is less efficient than a var (by reference) or out parameter. You see these same people who claim size doesn't matter recommending FastCGI to people who would be better off just using CGI. You see these same people claim interpreters are bad - yet they themselves use the command line (a command line is an interpreter). You see these same people who claim run-time interpreters are bad using XML and config files in their applications (XML and config files are scripts that need to be interpreted; if they were true to their zealotry, they would use binary records or binary structures instead).
In other words - size does matter, we're all eventually compiling interpreters, and people are full of zealous shit.
Compilers and interpreters are combined in all software, no matter how you try to separate them. If you can exploit the advantages of both, and not be stupid (such as shipping 3MB exes which are bigger than some entire interpreters), then you will be a better hacker. Embedding an interpreter in a compiled program is common practice - when someone clicks a button, we are interpreting their click; when someone enters a search path into an edit box, we are interpreting their file path. When someone enters a search pattern into a find box, we are interpreting a small form of a regex. When someone opens a compiled program that makes use of an INI file, that compiled program is paradoxically using an internal interpreter - whether the compiler zealots like run-time interpretation or not doesn't change the fact.
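To make the INI point concrete, here is the kind of mini interpreter hiding inside every compiled program that reads a config file - a sketch, with a made-up settings file (the keys and values are purely illustrative):

```shell
# A compiled program that reads this file is running a tiny interpreter:
# scan lines, match a key, split on '=' - parsing, at run time.
cat > settings.ini <<'EOF'
[window]
width=800
height=600
EOF

# The "interpreter": one sed expression that finds the key and strips it off.
width=$(sed -n 's/^width=//p' settings.ini)
echo "window width is $width"
```

The compiled binary doing the equivalent in Pascal is no less an interpreter than this one-liner - it's just interpretation the zealots don't notice.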
There is no line to be drawn. You can't draw a line and say that speed matters and size doesn't, or that interpreters are a no-no and compilers are king. There is no complete separation of compiled code versus interpreted code. There is no size versus speed comparison - size and speed are directly related. They are all related. If you are going to claim that speed matters, you had better also claim that size matters, because they are related. And if you are going to claim that interpreting anything is bad, then you had better look at all the mini interpreters inside your compiled code. There are a bunch of interpreters in our compiled code acting at run time - some people are just too blind to see them. Why? Because they have their compiler glasses on, or their speed glasses on, while at the same time using XML or button click events. Which is a paradox, since interpretation is being mixed into a compiled program. And that blinds some people - they can't see that what they are doing is both interpretation and compiling, and that speed and size sometimes do and sometimes don't matter.
What I'm getting at here is that programming is more complicated than picking a certain zealotry to follow - zealotries like "size isn't important", or "compiled programs should never use regular expressions because regular expressions are interpreted, not compiled". Hey, clicking a button after entering a search path with an asterisk pattern is the same bloody thing - that is interpretation inside a compiled program!
A problem in the Pascal community is that I see a lot of paradox going on. I see people claiming that size doesn't matter, that compilers rule, that interpreters suck, and that XML rules. How can interpreters suck, if parsing an XML file is interpreting? Yes, parsing a config file is interpreting too - at run time! You could have bloody compiled your structures into binary records instead of XML, you wanker - so stop bashing and stop being a zealot.
And stop saying that size doesn't matter while speed at run time does - especially those Lazarus zealots who think 3MB exes are perfectly acceptable. Speed at run time is not necessarily as important as initial load time, because many of the applications a hacker opens are only open for a few minutes or seconds and then shut down. Ninety percent of the time, a hacker is in a spur of the moment and needs to open a GUI to get the X and Y jobs done quickly. In fact, the whole point of keyboard shortcuts is that they are faster at loading things - not that keyboard shortcuts run faster algorithms. They are faster at loading things, and loading things is directly related to executable size. So if you argue that size doesn't matter, you are also saying that keyboard shortcuts are useless and that using the mouse is better. You aren't saying that directly, but by logical deduction we can assume you mean it.
Make note: for those who aren't hackers, the size of the exe doesn't matter as much. Some people are so slow with computers that waiting 15-30 seconds for a program to open is perfectly fine. But hackers are pre-thinking what they are going to do before programs even open, and if the program takes 15 seconds we lose our pre-thoughts - they are sometimes jam-packed into our brain cells for only 5 seconds, until something else comes into our mind, such as remembering an unfinished program that needs to be completed some time.
Now, everyone is guilty of living a paradox - but as humans we can become less zealous and more realistic, more understanding. Understand that size is not magically separate from speed, since size affects loading time and memory, which are related to speed. Understand that there is not even a fine line to draw between an interpreter and a compiler - they are exploited together and never used separately. A click is interpreted, and a path to a file location is interpreted, even in a compiled program.
And yes, I'm exaggerating, and yes, things are getting better with time (as opposed to getting more bloated). And yes, I realize that sometimes I said 6MB and other times 3MB (it depends on whether the strip command works, whether debugging is on, and all that other annoying shit the typical Delphi end user doesn't care about). No, I'm not a typical Delphi end user - but look at Lazarus' target market.
The most ridiculous thing I've heard is that increasing speed necessarily increases size - nonsense as a general rule. Sometimes that trade-off exists, but it is not a linear law.
FPC 2.2.0 is an excellent example of how we can improve a compiler without making the end software more bloated. FPC 2.2.0 adds more features yet makes software leaner and faster, and it takes less time to compile. This proves that size and speed can be improved together, not just one at the expense of the other. Pipe dream? No.