Re: Note for Roger and Skywise - October 29, 2013
Posted by EQF on October 31, 2013 at 07:00:44:

I posted a note to the Python Newsgroup asking those questions. Some were already answered. I am still watching for a better answer regarding calculation speed.


Regarding one of your comments, this is where Perl has an advantage over many other programming languages.

When you chain Perl programs together using "do" statements, the called file is compiled and run inside the same interpreter, so all of the data and arrays etc. are automatically available to the chained program.
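A minimal sketch of that sharing (file and variable names here are made up, not from my real programs): this writes a tiny "loader" file and then pulls it in with "do", showing that a package variable the loader fills is visible back in the caller.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Create a stand-in loader file, then run it with "do".
# "do" executes the file in this same interpreter process,
# so variables it fills stay in memory here.
open my $fh, '>', 'loader_demo.pl' or die $!;
print $fh 'our @quake_data; push @quake_data, "record"; 1;';
close $fh;

our @quake_data;                 # package variable shared with the loader
do './loader_demo.pl';
print scalar(@quake_data), " record(s) now in memory\n";   # prints: 1 record(s) now in memory
unlink 'loader_demo.pl';
```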

So, if I want to load large files and then make changes to the code without having to reload the files, I run a simple Perl program that calls a more complex program that loads and processes all the data. When that finishes, control returns to the original program. Then I can edit the code of the more complex program and tell the original program to call it again; the data are still active in memory.

And if I made a code error, the more complex program fails to compile and the error is reported, but the original program keeps running. I make the corrections and tell the original program to call the more complex program again. The data remain active in memory in spite of the code errors.
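The driver loop above can be sketched like this (file names and the @big_data variable are hypothetical; the error-handling follows the documented behavior of Perl's "do", which sets $@ on a compile error and $! when the file cannot be read):

```perl
#!/usr/bin/perl
# driver.pl -- minimal sketch of the "simple program calls the
# complex program" pattern. worker.pl does the slow loading and
# fills @big_data; a compile error in worker.pl does not kill
# this driver, so the loaded data survive between attempts.
use strict;
use warnings;

our @big_data;    # filled by worker.pl; survives failed recompiles

while (1) {
    unless (defined(my $rv = do './worker.pl')) {
        if ($@) {
            # worker.pl had a compile (or runtime) error; whatever
            # is already in @big_data is untouched.
            print "Error in worker.pl: $@";
        }
        elsif ($!) {
            print "Could not read worker.pl: $!\n";
        }
    }
    print scalar(@big_data), " records currently in memory.\n";
    print "Edit worker.pl if needed, then press Enter to call it again (Ctrl-C quits): ";
    <STDIN>;
}
```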

If I need to see what the code errors were, I can run the more complex program from a Command Prompt (DOS) window. The code errors are then listed and remain visible in that window. Otherwise, if the Perl program is launched directly from Windows, its window displays the error messages for just a second and then closes before they can be read.
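That is, from an already-open Command Prompt window (the file name here is made up), either run the program normally or just syntax-check it; either way the messages stay on the screen because the window outlives the program:

```shell
# Run the program; compile errors print to this window and remain visible.
perl complex_program.pl

# Or check the syntax only, without running the program at all:
perl -c complex_program.pl
```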

Those procedures save a lot of programming time. Programs can actually be developed while their data remain active in memory. This is important because some of my Perl programs can take an hour or more to get all of the data calculated and stored in the necessary arrays. And I don’t want to have to start over if a program code error is encountered.


Follow Ups:
     ● Re: Note for Roger and Skywise - October 29, 2013 - Skywise  16:48:57 - 10/31/2013  (101242)  (0)