
For example, I have a file like the following:

A,Y29tLz9hPTQ2JmM9NDQzNzgmczE9Q0,123  
B,FJNLTA2MjQyMDE3LVAmczI9ODQ3MDA,321

I want to print field 1, field 2 (base64-decoded), and field 3.

Required output:

A,result of base 64 decode,123
B,result of base 64 decode,321
Jyothi Tulasi
  • Can someone help me out with this? Please provide a solution without a while loop. – Jyothi Tulasi Dec 13 '17 at 18:51
  • 1
    "without while loop"? Why? – Charles Duffy Dec 13 '17 at 18:52
  • If your goal is performance -- i.e., having a single base64-decoding process rather than starting a subprocess per line -- that requires a whole different approach (probably one best implemented in Python or another language with native base64 decoding, rather than shell). – Charles Duffy Dec 13 '17 at 18:53
  • 1
    Post the code that you have tried so far. Also, if your instructor places special requirements on how you solve this problem, let us know about those requirements. – John1024 Dec 13 '17 at 18:54
  • What you're asking for can't be done in shell alone, and the tools to achieve this vary by operating system. Include your work so far, and we'll help you debug it. – ghoti Dec 13 '17 at 19:03
  • The reason I requested no while loop is that if we have millions of lines in a single file, it will take more time to process. That's why I asked for a solution without a while loop. – Jyothi Tulasi Dec 13 '17 at 19:21
  • I have tried the script below: awk -F "," ' {OFS=FS} { "echo "$2" | base64 --decode" | getline x; print $1,x, $3} ' file.txt > Outputfile – Jyothi Tulasi Dec 13 '17 at 19:28
  • @JyothiTulasi If you have "millions of lines" and your goal is speed, why use shell? – John1024 Dec 13 '17 at 19:29
  • @John1024 I am trying in both shell and Perl. – Jyothi Tulasi Dec 13 '17 at 19:42
  • In Perl I have tried this: perl -MMIME::Base64 -ne 'printf "%s\n",decode_base64($_)' file (see the field-wise sketch after these comments) – Jyothi Tulasi Dec 13 '17 at 19:49
  • Given input in the millions of lines, this is a job that shell is absolutely unsuited for. I wouldn't use perl either, but compared to shell, it's a winner. – Charles Duffy Dec 13 '17 at 19:56
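The Perl one-liner above decodes the whole input line rather than only the second field, and it drops fields 1 and 3. A minimal field-wise sketch, assuming comma-separated input with exactly three fields and the core MIME::Base64 module (file.txt stands in for the real input file):

# assumption: decode only field 2, pass fields 1 and 3 through unchanged
perl -MMIME::Base64 -F',' -lane 'print join ",", $F[0], decode_base64($F[1]), $F[2]' file.txt

Because a single perl process reads the whole file, there is no per-line subprocess cost; as the answers below show, the second line's field decodes to binary bytes, so that part of the output will contain control characters.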

2 Answers


You can do this in a bash script with a few read commands and base64 -D:

#!/bin/bash

# read each line, split it on commas, and decode the middle field
while read -r line
do
  IFS=',' read -r c1 c2 c3 <<< "$line"
  data="$(base64 -D <<< "$c2")"  # -D decodes on BSD/macOS; GNU coreutils base64 uses -d/--decode
  echo "$c1,$data,$c3"
done < "inputfile.txt"

One of your base64 strings decodes to binary data, though, so the output may look funky due to control characters.

A,com/?a=46&c=44378&s1=,123
���KL
  �
�,321T  �̏N
Alexander O'Mara
  • Note "without while loop" in the OP's first comment. – Charles Duffy Dec 13 '17 at 18:57
  • BTW, personally, I'd use `IFS=, read -r first data rest` rather than `read -a` -- with the code as it is you're discarding fourth columns and onward if they exist, whereas if you don't use `read -a` it'll just put all remaining columns into the last one. – Charles Duffy Dec 13 '17 at 18:58
  • I'd also suggest `data=$(base64 -D <<<"${parts[1]}")` in place of the pipeline -- heredocs aren't free either, but they're usually less expensive than a FIFO from a subshell running `echo`. – Charles Duffy Dec 13 '17 at 18:59

Give this awk one-liner a try:

awk -F',' -v OFS=',' '"echo "$2" | base64 --decode" | getline $2' file

Test with your example:

kent$  cat f
A,Y29tLz9hPTQ2JmM9NDQzNzgmczE9Q0,123  
B,FJNLTA2MjQyMDE3LVAmczI9ODQ3MDA,321

kent$  awk -F',' -v OFS=',' '"echo "$2" | base64 --decode" | getline $2' f

The first line decodes to the com/?a=46&c=44378&s1= fragment shown in the other answer; the second field of line B decodes to binary data, so that part of the output contains unprintable control characters.
Kent
  • Hmm. It certainly meets the text of "without while loop", but absolutely violates the spirit (since it's running a separate `base64` command per line of input). – Charles Duffy Dec 13 '17 at 18:56
  • ...and we're vulnerable to shell injection attacks if we had `$(touch /tmp/i-am-evil)` in place of base64-encoded data. – Charles Duffy Dec 13 '17 at 18:56
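Beyond those two points, the cmd | getline pattern also leaves its pipe open; since the command string differs for every input line, a file with millions of lines can run out of open file descriptors unless the pipe is closed. A sketch of the same one-liner with an explicit close(), still assuming GNU base64's --decode flag and still subject to the per-line subprocess cost and injection concern noted above:

# assumption: GNU coreutils base64; close(cmd) releases the pipe opened for each line
awk -F',' -v OFS=',' '{cmd = "echo " $2 " | base64 --decode"; cmd | getline $2; close(cmd); print}' file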