
I am used to object-oriented programming. Now I have just started learning Unix bash scripting on Linux.

I have a Unix shell script. I want to break it down into "modules", or preferably into programs similar to "more", "ls", etc., and then use pipes to link my programs together, e.g. myProg1 "some input" | myProg2 | myProg3.

I want to organize my code and make it neater, instead of having everything in one script. It will also make testing and development easier.

Is it possible to do this, especially as a newbie?

Joshua Taylor
bashboy
  • You can use `source file`; the code in that file will then be available. – hetepeperfan Jul 30 '13 at 19:16
  • That's a little difficult to answer in detail without knowing the script or its function. The general answer is: "probably". It would depend upon whether it makes logical sense to break the script's function into modules. Programs like `ls`, `more`, etc., each have a well-defined, distinct purpose which makes sense as a stand-alone function and as a piped capability. You'd need to decide if parts of your script are of that nature; then the answer would be, "Yes, it is possible to break it up into modules, and it makes sense." – lurker Jul 30 '13 at 19:16
  • @mbratch - Most of my desired modules will not really be general-use programs, just specific to my work. I only wanted convenience and ease of development and testing. – bashboy Jul 30 '13 at 19:29
  • Indeed, I was probably a bit restrictive in my language. Instead of "general purpose" I would say "single function" where "function" is a bit left up to you to define. The answer is probably "yes" but contingent upon exactly what your script does. – lurker Jul 30 '13 at 19:43
  • You can use `source` to include code (as @hetepeperfan mentioned), but shell scripts are in general not known for their great scalability. You might consider scripting languages (Python, Perl, Ruby, ...) that are better suited for medium-sized codebases. Of course there are legitimate uses for shell scripts, even large ones, but you should think about it for a moment. – mnagel Jul 31 '13 at 12:39

3 Answers


There are a few things you could take a look at, for example the use of aliases in bash, stored either in .bashrc or in a separate file sourced from .bashrc.

That will make running commands easier.

Take a look here for expanding commands into aliases (simple aliases are easy).
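
For instance, a couple of simple aliases kept in a separate file and pulled in from .bashrc might look like this (the names and the ~/.bash_aliases path are just examples):

# in ~/.bashrc: pull the aliases in from a separate file
if [ -f ~/.bash_aliases ]; then
    . ~/.bash_aliases
fi

# in ~/.bash_aliases
alias ll='ls -lh'                              # long listing, human-readable sizes
alias countdupes='sort | uniq -c | sort -rn'   # rank repeated input lines

After re-sourcing .bashrc, something like cat names.txt | countdupes works at the prompt just like a small program in a pipe.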

You can also look into using functions in your code (there are lots of bash scripts in the above link's home folder); to make sense of functions, browse this site :) which has much better examples...

Take a look here for piping tail output into another script: pipe tail output into another script
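
A minimal sketch of that idea, assuming a hypothetical reader.sh that simply consumes its standard input:

#!/bin/bash
# reader.sh - processes whatever is piped into it, line by line
while read -r line; do
    echo "got: $line"
done

It could then be fed by tail (or any other command), for example: tail -f /var/log/syslog | ./reader.sh (the log path is just an example).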

The thing with bash is its flexibility, so for example, if something starts to get too messy for bash, you could always write something in Perl/Java/any language, call it from within your bash script, capture its output and do something else with it...
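
A tiny sketch of that pattern; the Perl one-liner here is only a stand-in for whatever real work you would delegate:

#!/bin/bash
# delegate a calculation to another language, capture its output, carry on in bash
answer=$(perl -e 'print 6 * 7')
echo "perl said: $answer"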

I'm not sure why you want all the pipes anyway; here is something that may be of help:

./example.sh 20
function one starts with 20
In function 2 20 + 10 = 30
Function three returns 30 + 10 = 40
------------------------------------------------
------------------------------------------------
Local function variables global:
Result2: 30 - Result3: 40 - value2: 10 - value1: 20

The script:

example.sh

#!/bin/bash

# take the starting value from the first command-line argument
input=$1

# pull in the shared functions
source ./shared.sh

# call the first function, which chains into functions two and three
one

echo "------------------------------------------------"
echo "------------------------------------------------"
echo "Local function variables global:"
echo "Result2: $result2 - Result3: $result3 - value2: $value2 - value1: $value1"

shared.sh

# one: takes the script's input value and starts the chain
function one() {
        value1=$input
        echo "function one starts with $value1"
        two
}

# two: adds 10 to the value set in one
function two() {
        value2=10
        result2=$(expr $value1 + $value2)
        echo "In function 2 $value1 + $value2 = $result2"
        three
}

# three: adds a local value to the running total
function three() {
        local value3=10
        result3=$(expr $result2 + $value3)
        echo "Function three returns $result2 + $value3 = $result3"
}

I think the pipes you mean can actually be functions, and each function can call the next; you then give the script a value, which it passes through the functions.

bash is pretty flexible about passing values around: as long as a function called earlier has set a variable, the next function it calls can reuse it, or it can be read from the main program.

I have also split the functions out into a separate file, which can be sourced by another script to carry out the same functions.

Edit to add: thanks for the upvote, I have also decided to include this link:

http://tldp.org/LDP/abs/html/sample-bashrc.html

There is an awesome .bashrc there to reuse. It has a lot of functions and will also give some insight into how to simplify a lot of daily repetitive commands, such as those that require piping; an alias can be written to do all of them for you.
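
For example, a small function (or alias) in .bashrc can wrap a pipeline you would otherwise retype every day; psgrep here is just an illustrative name:

# in ~/.bashrc
psgrep() {
    ps aux | grep -i -- "$1" | grep -v grep
}
# usage: psgrep firefox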

V H
  • Thanks, but I want to do everything in bash. – bashboy Jul 30 '13 at 19:34
  • That is fine so long as it is possible; a bash script relies on other people's compiled programs anyway. Running nc or lsof or ps or anything, as you should already know, means running other people's compiled C programs, so I'm unsure what the difference would be in calling your own program written in another language... – V H Jul 30 '13 at 19:59
  • For example, nc or ncat will not work for https or ssl connections; you could use elinks or links to do this, or, if you cannot find something available, write something in Perl/Java or a language of your choice and call it from within your own bash script – V H Jul 30 '13 at 20:00
  • I have put a sample script that may be of help – V H Jul 30 '13 at 20:20

You can do one thing. Just as a C program can be divided into a header file and a source file to reduce complexity, you can divide your bash script into two scripts - a header and a main script - but with some differences.

Header file - This will contain all the common variables and functions that will be used by your main script.

Your script - This will contain only function calls and other logic. You need to use `source <header-file path>` at the start of your script to make all the functions and variables declared in the header available to your script, as sketched below.
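
A minimal sketch of that split, with made-up file names (header.sh and main.sh) and a made-up log function:

header.sh

# header.sh - common variables and functions shared by the main script
LOG_FILE="/tmp/myapp.log"

log() {
    echo "$(date '+%F %T') $*" >> "$LOG_FILE"
}

main.sh

#!/bin/bash
# main.sh - only the calls and the logic
source "$(dirname "$0")/header.sh"

log "starting up"
# ... the real work goes here ...
log "done"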

Vishal R

Shell scripts have standard input and output like any other program on Unix, so you can use them in pipes. Splitting your scripts is a good solution because you can later use them in pipes with other commands.
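
For example, a trivial filter script reads standard input and writes standard output, so it can sit in the middle of a pipeline like any other command (the name keep-errors.sh is just an example):

#!/bin/bash
# keep-errors.sh - pass through only the lines that mention errors
grep -i 'error'

It can then be combined freely, e.g. cat app.log | ./keep-errors.sh | sort | uniq -c.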

I organize my Bash projects in the following way:

  • Each command is put in its own file
  • Reusable functions are kept in a library file which is just a classic script with only functions
  • All files are in the same directory, so commands can find the library with $(dirname "$0")/library (see the sketch after this list)
  • Configuration is stored in another file as environment variables
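
A minimal sketch of one command laid out this way; all the file names (mycommand.sh, library.sh, config.sh) and the greet_user function are just placeholders:

config.sh

# config.sh - configuration as environment variables
export PROJECT_NAME="demo"

library.sh

# library.sh - only functions, no top-level logic
greet_user() {
    echo "Hello from $1"
}

mycommand.sh

#!/bin/bash
# mycommand.sh - one command in its own file, next to the library
dir=$(dirname "$0")
source "$dir/config.sh"
source "$dir/library.sh"

greet_user "$PROJECT_NAME"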

To keep things clear, you should not use global variables to communicate between functions and the main program.
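
A sketch of what that looks like in practice: values go in as arguments and come back on standard output, instead of through globals (add_tax is a made-up example):

add_tax() {
    local price=$1
    local rate=$2
    echo $(( price + price * rate / 100 ))
}

total=$(add_tax 200 20)    # total is 240, no global variable involved
echo "Total: $total"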

I prepare a template for scripts with the following parts prepared (a sketch of it follows the list):

  • Header with name and copyright
  • Read configuration with source
  • Load library with source
  • Check parameters
  • Function to display help, which is called if asked for or if parameters are wrong
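
Such a template might start out roughly like this; config.sh, library.sh and the exact checks are placeholders:

#!/bin/bash
# myscript.sh - short description, author and copyright notice

usage() {
    echo "Usage: $(basename "$0") <input-file>"
}

# read configuration and load the function library, both via source
source "$(dirname "$0")/config.sh"
source "$(dirname "$0")/library.sh"

# check parameters; display help if asked for or if they are wrong
if [ "$1" = "-h" ] || [ "$#" -ne 1 ]; then
    usage
    exit 1
fi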

My best advice is: always write the help function, as the next person who will need it is... yourself!

To install your project, you simply copy all the files and explain what to configure in the configuration file.

Mickaël Bucas
  • I am not sure I understand the /organizational/ distinction you are making here between commands and functions (apart from their obvious technical differences). Is it a hierarchical relation? i.e. functions in libraries are used in commands, and user scripts use those library functions only indirectly via their use of commands? – Jacob Lee Apr 27 '17 at 20:11
  • I try to write functions that do simple things, so they can be seen as reusable building blocks for commands. In past projects I have used them for things like: log and error management, database connection and query execution, and other specific tools like data loaders or schedulers' basic functions – Mickaël Bucas May 09 '17 at 12:26
  • @MickaëlBucas This is exactly the sort of approach I was looking for! Do you happen to have any example repos you have set up like this? Thanks! – Jackson Holiday Wheeler Aug 03 '20 at 06:18
  • @Alacritas Unfortunately most projects with significant shell scripts I've worked on are inside companies that don't publish code. I have some scripts at home but never took steps to publish them! – Mickaël Bucas Aug 04 '20 at 08:55
  • @MickaëlBucas OK, thanks anyway! If you ever do push a repo like that to GitHub, GitLab, etc., please let me know. Thank you! – Jackson Holiday Wheeler Aug 05 '20 at 10:39