
I have a file (text or CSV) that I generate containing a list of part numbers. I need to take this list, automatically download the spec sheets for those parts, and print them. On the website, I need to input each part number and then print the results. What's the best way to do this?

Okay everyone, here's what I was doing before, but it would take over an hour to process against a Progress 4GL (version 9.1) database in a QAD v8.6e environment on a Red Hat Linux server:

FNAME=`date +%y%m%d%H%M%S`

echo requiredmcpartno=$1 | lynx -accept_all_cookies -nolist -dump -post_data 75.144.##.###/specdata/specdata.asp 2>&1 | tee $FNAME | lp -d$2   >>/apps/proedi/####/ftp/log/brownart.log

grep "Unit of Issue" $FNAME | cut --delimiter=: --fields=2 | awk '{print $1}'

grep -q "PACKAGING SPEC IS OBSOLETE FOR THIS PART NUMBER" "$FNAME"
if [ $? -eq 0 ]; then
  echo 0
  echo nopic
  exit
fi

cd /apps/proedi/####/ftp/ftpscripts
rm -fr 184.168.##.###/ 75.144.##.###/ www.google-analytics.com/   >>/apps/proedi/####/ftp/log/brownart.log 2>&1
wget -p -m -k -K -E -H --cookies=on --post-data="requiredmcpartno=$1"   75.144.##.###/specdata/specdata.asp >/dev/null 2>&1

/apps/proedi/####/ftp/ftpscripts/printspec.sh $1 $2 >>/tmp/printspec.log 2>&1

cat /apps/proedi/####/ftp/ftpscripts/"$1".pt
rm -f /apps/proedi/####/ftp/ftpscripts/"$1".pt

rm "$FNAME" >>/apps/proedi/ford/ftp/log/brownart.log 2>&1

Then the printspec.sh script:

file=184.168.70.174/partandpackagingphotos/PhotoDetailsSpecdata.aspx\?p\=$1.html

if [ ! -f "$file" ]; then
  echo nopic >/apps/proedi/ford/ftp/ftpscripts/"$1".pt
  exit
fi

grep -q "No Pictures Available for this Part Number" "$file"
if [ $? -eq 0 ]; then
  echo nopic >/apps/proedi/ford/ftp/ftpscripts/"$1".pt
  exit
fi

html2ps -i .7 184.168.70.174/partandpackagingphotos/PhotoDetailsSpecdata.aspx\?p\=$1.html | lp -d$2 -s

echo picfnd >/apps/proedi/ford/ftp/ftpscripts/"$1".pt

The wget command takes way too long to process in Unix. Our customer may send us a conveyance file with 150-200 parts 8-9 times a day, and we need to download all of the pictures associated with each part every time we receive parts.
I was thinking of just making a flat file (text or CSV), then having the user run a batch file on their Windows computer to connect to the Unix server and download the file to their machine. After that, either the same batch job or an Excel template (or something similar) on the Windows side would download the pictures and print the spec sheets to their default printer.
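
Something along these lines is what I had in mind for producing the list on the Unix side. It's only a sketch, since I'm assuming a comma-delimited conveyance file with the part number in the first field; the field number would need to match whatever we actually receive:

# makepartlist.sh - hypothetical helper: pull part numbers out of a conveyance file
# usage: makepartlist.sh conveyance.csv
# assumes a comma-delimited file with the part number in field 1
cut -d, -f1 "$1" | sort -u > partlist.txt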

Sorry for not posting all of this initially.

Jim S.
  • `What's the best way to do this?` With some code! You have been a user here long enough to know this is not a good question. What have you already tried? What specific issues did you run into? – admdrew Aug 04 '14 at 19:13
  • This is incredibly broad and will likely be closed. A much more specific question, usually about *code you've written*, will get you answers. Simply stating a problem and asking how it can be done does not show any research or coding effort on your side. Please show what you've tried so far. – tnw Aug 04 '14 at 19:13
  • @admdrew He did say "what's the best way?", implying some leeway for opinion. Therefore I'm of the opinion that you should turn the computer on first, then check power later. Though I did leave out the important step of grabbing a beer and reading a "how to write a website for dummies" book while waiting for the pizza rolls to finish heating. – mason Aug 04 '14 at 19:41
  • @mason Very true. You make some excellent points that have invalidated my previous comment. – admdrew Aug 04 '14 at 19:45

1 Answer


The first thing that I would try is to break the process into two or more independent pieces and run them in parallel. The scripts above appear to take a part number as a parameter. I would guess that whatever is feeding them the part number is working from a list (the "conveyance file"?). That list would be the obvious place to make the split.
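
If the list is still just a flat file at that point (one part number per line), the split can also be done straight from the shell. This is only a rough sketch, assuming GNU xargs is available and that "getpart.sh" (full path, or on PATH) is whatever wrapper performs a single download-and-print:

NUMJOBS=4          # number of concurrent downloads - tune this to find the sweet spot
xargs -n 1 -P "$NUMJOBS" getpart.sh < partlist.txt

The 4GL version below does the same kind of split but drives it from a database table instead of a file.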

If you do it in such a way that the number of concurrent processes is configurable, it should be simple to find the "sweet spot". Suppose the list of parts to be downloaded is in a table called "part" with fields "needsDownload" and "partNum". For the sake of simplicity we will assume that partNum is an integer and that the actual part numbers needing download are randomly distributed. If you are driving this process with Progress 4GL code you might write a control program something like this:

/* control.p
 *
 * to run two "threads":
 *
 *   _progres -b dbName -p control.p -param "1,2" > control.1.log 2>&1 &      # 1 of 2
 *   _progres -b dbName -p control.p -param "2,2" > control.2.log 2>&1 &      # 2 of 2
 *
 *
 */  

define variable myThread   as integer no-undo.
define variable numThreads as integer no-undo.

myThread   = integer( entry( 1, session:parameter )) - 1.  /* this just allows the "1 of 2" stuff to be more "human sensible" */
numThreads = integer( entry( 2, session:parameter )).

for each part exclusive-lock where needsDownload = true and (( partNum modulo numThreads ) = myThread ):

  os-command value( "getpart.sh " + string( partNum )).
  needsDownload = false.

end.
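
getpart.sh above stands in for whatever script actually performs a single download and print. I don't know exactly what yours looks like, so this is just a sketch pieced together from the wget and printspec.sh calls in your question (the default printer queue name is an assumption):

#!/bin/sh
# getpart.sh - hypothetical per-part worker: fetch one spec sheet and print it
# usage: getpart.sh PARTNUMBER [PRINTER]
PART=$1
PRINTER=${2:-specprinter}        # assumed default printer queue name

cd /apps/proedi/####/ftp/ftpscripts || exit 1

# wget flags copied from the question
wget -p -m -k -K -E -H --cookies=on --post-data="requiredmcpartno=$PART" \
  75.144.##.###/specdata/specdata.asp >/dev/null 2>&1

./printspec.sh "$PART" "$PRINTER" >>/tmp/printspec.log 2>&1

If several of these run at the same time they should probably each work in their own scratch directory so the wget mirror trees don't step on each other.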

Of course the problem might be that the external system is too slow. No amount of programming on your end will fix that.

Tom Bascom