How can I separate a file with columns 1-7 of data into ONE column with all the data in order?

For example, I have all of this data, but I want it organized so that the output file is purely numbers, with rows 1-7 corresponding to columns 1-7 of the first input row, rows 8-14 corresponding to columns 1-7 of the second input row, and so on.

Can I do this using awk?


Example of data:

Total 31.6459262.4011 31.6463 31.6463 0.0006 0.0006 0.0007 Total 0.0007 0.0007 0.0007 0.0007 0.0007 0.0008 0.0008 Total 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 Total 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 Total 0.0008 0.0007 0.0007 0.0007 0.0006 0.0006 0.0006 Total 0.0005 0.0005 0.0004 0.0003 0.0003 0.0002 0.0001 Total 0.0001 0.0000 -0.0001 -0.0002 -0.0002 -0.0003 -0.0004 Total -0.0005 -0.0006 -0.0007 -0.0008 -0.0009 -0.0010 -0.0011 Total -0.0011 -0.0012 -0.0013 -0.0014 -0.0015 -0.0015 -0.0016 Total -0.0016 -0.0017 -0.0018 -0.0018 -0.0018 -0.0019 -0.0019 Total -0.0019 -0.0019 -0.0020 -0.0020 -0.0020 -0.0020 -0.0020 Total -0.0019 -0.0019 -0.0019 -0.0019 -0.0018 -0.0018 -0.0018 Total -0.0017 -0.0017 -0.0017 -0.0016 -0.0016 -0.0015 -0.0015 Total -0.0014 -0.0014 -0.0013 -0.0012 -0.0012 -0.0011 -0.0011 Total -0.0010 -0.0010 -0.0009 -0.0009 -0.0008 -0.0008 -0.0007 Total 31.6459262.4010 31.6461 31.6462 0.0006 0.0006 0.0006 Total 0.0007 0.0007 0.0007 0.0007 0.0007 0.0007 0.0007 Total 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008

The output is lengthy to type, but it would consist of all these numbers arranged in a single column, excluding the four large numbers that repeat every so often: 31.6459, 262.4010, 31.6461, and 31.6462. These four numbers are not always exactly the same, but they are always greater than roughly 20, and they repeat every 101 numbers.

Output:

0.0006 0.0006 0.0007 0.0007 0.0007 0.0007 0.0007 0.0007 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0007 0.0007 0.0007 0.0006 0.0006 0.0006 0.0005 0.0005 0.0004 0.0003 0.0003 0.0002 0.0001 0.0001 0.0000 -0.0001 -0.0002 -0.0002 -0.0003 -0.0004 -0.0005 -0.0006 -0.0007 -0.0008 -0.0009 -0.0010 -0.0011 -0.0011 -0.0012 -0.0013 -0.0014 -0.0015 -0.0015 -0.0016 -0.0016 -0.0017 -0.0018 -0.0018 -0.0018 -0.0019 -0.0019 -0.0019 -0.0019 -0.0020 -0.0020 -0.0020 -0.0020 -0.0020 -0.0019 -0.0019 -0.0019 -0.0019 -0.0018 -0.0018 -0.0018 -0.0017 -0.0017 -0.0017 -0.0016 -0.0016 -0.0015 -0.0015 -0.0014 -0.0014 -0.0013 -0.0012 -0.0012 -0.0011 -0.0011 -0.0010 -0.0010 -0.0009 -0.0009 -0.0008 -0.0008 -0.0007 0.0006 0.0006 0.0006 0.0007 0.0007 0.0007 0.0007 0.0007 0.0007 0.0007 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008 0.0008
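(For the general case, when the columns are cleanly whitespace-separated — which this data is not, since some fields run together — flattening every field into one column is a one-line awk loop; a minimal sketch with made-up sample input:)

```shell
# Flatten whitespace-separated columns into a single column, row by row
printf '1 2 3\n4 5 6\n' |
awk '{ for (i = 1; i <= NF; i++) print $i }'
```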

------------- Reply -------------

There are PLENTY of numbers that repeat frequently in your data, so we can't exclude the ones you mention based on repetition alone - do you want to exclude numbers with value >= 20?

If so, this may be what you want using GNU awk for FIELDWIDTHS:

$ awk 'BEGIN{FIELDWIDTHS="8 8 8 8 8 8 8 8"}          # fixed-width fields (GNU awk only); adjust widths to your file
       {for (i=2;i<=NF;i++)                          # field 1 is the "Total" label, so skip it
            if ($i<20) {sub(/^ +/,"",$i); print $i}}' file   # strip leading blanks, keep values < 20
0.0006
0.0006
0.0007
0.0007
0.0007
0.0007
0.0007
0.0007
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0007
0.0007
0.0007
0.0006
0.0006
0.0006
0.0005
0.0005
0.0004
0.0003
0.0003
0.0002
0.0001
0.0001
0.0000
-0.0001
-0.0002
-0.0002
-0.0003
-0.0004
-0.0005
-0.0006
-0.0007
-0.0008
-0.0009
-0.0010
-0.0011
-0.0011
-0.0012
-0.0013
-0.0014
-0.0015
-0.0015
-0.0016
-0.0016
-0.0017
-0.0018
-0.0018
-0.0018
-0.0019
-0.0019
-0.0019
-0.0019
-0.0020
-0.0020
-0.0020
-0.0020
-0.0020
-0.0019
-0.0019
-0.0019
-0.0019
-0.0018
-0.0018
-0.0018
-0.0017
-0.0017
-0.0017
-0.0016
-0.0016
-0.0015
-0.0015
-0.0014
-0.0014
-0.0013
-0.0012
-0.0012
-0.0011
-0.0011
-0.0010
-0.0010
-0.0009
-0.0009
-0.0008
-0.0008
-0.0007
0.0006
0.0006
0.0006
0.0007
0.0007
0.0007
0.0007
0.0007
0.0007
0.0007
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008
0.0008

I feel like you could have come up with a briefer example btw.
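If you don't have GNU awk, here is a portable sketch that sidesteps FIELDWIDTHS entirely. It assumes every value really does have exactly 4 decimal places, so fused fields like "31.6459262.4011" split correctly under the regex; the sample line below is shortened from your data for illustration:

```shell
# Pull out every number with exactly 4 decimal places, then drop anything >= 20
printf 'Total 31.6459262.4011 31.6463 0.0006 -0.0007\n' |
grep -oE -- '-?[0-9]+\.[0-9]{4}' |
awk '$1 < 20'
# prints:
# 0.0006
# -0.0007
```

For your real file, replace the printf with `< file`. The `-o` flag makes grep print each match on its own line, which gives you the one-column layout for free.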

Category:bash Views:128 Time:2019-03-22
Tags: bash awk sed


Copyright (C) dskims.com, All Rights Reserved.
