
Here is my script to read store quantities (QTY) per ITEM_SID. I need all records that share the same ITEM_SID to be merged together into one line; find below example images of the current result and the expected result.

SELECT DISTINCT
       i.SBS_NO,
       TO_CHAR(i.ITEM_SID) AS ITEMSID,
       i.ALU,
       a.LOT_NUMBER,
       a.LOT_NAME,
       a.LOT_NOTE,
       a.EXPIRY_DATE,
       a.ACTIVE,
       --s.STORE_NO,
       --s.QTY,
       DECODE(s.STORE_NO, 1, s.QTY, 0)  STR1QTY,
       DECODE(s.STORE_NO, 2, s.QTY, 0)  STR2QTY,
       DECODE(s.STORE_NO, 3, s.QTY, 0)  STR3QTY,
       DECODE(s.STORE_NO, 4, s.QTY, 0)  STR4QTY,
       DECODE(s.STORE_NO, 5, s.QTY, 0)  STR5QTY,
       DECODE(s.STORE_NO, 6, s.QTY, 0)  STR6QTY,
       DECODE(s.STORE_NO, 7, s.QTY, 0)  STR7QTY,
       DECODE(s.STORE_NO, 0, s.QTY, 0)  STR0QTY,
       DECODE(s.STORE_NO, 99, s.QTY, 0) WHQTY,

       i.DESCRIPTION2,
       i.DESCRIPTION3,
       i.DESCRIPTION4,
       i.DCS_CODE,
       i.ATTR,
       i.SIZ,
       i.FST_RCVD_DATE,
       i.UDF1_DATE,
       i.QTY_PER_CASE,
       i.ACTIVE,
       i.MARKDOWN_PRICE,
       i.UDF2_VALUE AS ITEM_STATUS

FROM inventory_v i
INNER JOIN LOT a
        ON i.ITEM_SID = a.ITEM_SID
INNER JOIN LOT_QTY s
        ON a.ITEM_SID = s.ITEM_SID
WHERE i.SBS_NO = 1
  AND i.ITEM_SID = a.ITEM_SID
  --AND i.ALU = '358N690175'
  AND a.ITEM_SID = s.ITEM_SID
  AND i.SBS_NO = a.SBS_NO
  AND a.SBS_NO = s.SBS_NO
  AND s.STORE_NO IN (0, 1, 2, 3, 4, 5, 6, 7, 99)
  AND a.ACTIVE = 1
  AND i.ACTIVE = 1

GROUP BY
       i.SBS_NO,
       i.ITEM_SID,
       i.ALU,
       a.LOT_NUMBER,
       a.LOT_NAME,
       a.LOT_NOTE,
       a.EXPIRY_DATE,
       a.ACTIVE,
       s.STORE_NO,
       s.QTY,
       i.DESCRIPTION2,
       i.DESCRIPTION3,
       i.DESCRIPTION4,
       i.DCS_CODE,
       i.ATTR,
       i.SIZ,
       i.FST_RCVD_DATE,
       i.UDF1_DATE,
       i.QTY_PER_CASE,
       i.ACTIVE,
       i.MARKDOWN_PRICE,
       i.UDF2_VALUE;

THANKS


[current result: image]

[expected result: image]

  • You should use [Aggregate functions](https://learn.microsoft.com/en-us/sql/t-sql/functions/aggregate-functions-transact-sql?view=sql-server-ver15) and not `DISTINCT`. – Luuk Aug 13 '20 at 06:53
  • You have a couple of cartesian products in your joins. `Inventory JOIN lot` has one row from inventory matching two rows from lot. `lot join lotqty` has one row from lot matching four rows from qty. Resolve these problems by adding extra conditions to the join or grouping/summing the rows from the right hand table – Caius Jard Aug 13 '20 at 06:58
  • Welcome to StackOverflow. :-) The current result shows that what you are trying to do is not easy. There is a lot of variation between the records for each ITEM SID so if you want one row per ITEM SID then you need to specify how the desired value for each column can be determined. – Simon G. Aug 13 '20 at 07:05
  • @SimonG it could potentially be as easy as `AND a.type = 123` - we can't tell because none of the original columns of sample data are included; there is only "here is my broken SQL, here are my broken results, and here is a result set of what I want with zero explanation of how e.g. the str1qty that is full of zeroes magically becomes 2". This SQL looks like it's had a bit of "coding by trial and error" thrown at it too - there's a useless DISTINCT and join predicates repeated in the WHERE clause, when really what it needs is the source of the cartesian explosion fixed and proper SUM()ming. – Caius Jard Aug 13 '20 at 08:08
  • This looks a lot like a pivot attempt. Check out these resources: [pivot tutorial](https://www.oracletutorial.com/oracle-basics/oracle-pivot/) and [pivot feature](https://www.oracle.com/technical-resources/articles/database/sql-11g-pivot.html). Also, add a tag that specifies your SQL implementation (Oracle SQL, Microsoft SQL Server, ...). – Sander Aug 13 '20 at 08:33
  • First of all, thanks for all the suggestions. @CaiusJard, as you can see, Simon G said "Welcome to StackOverflow" - that means this is this user's first question, and he joined this community to learn and gain more experience. You can ask me for more details if you are trying to help, or stay calm and let someone else give their suggestion. Anyway, thanks for your positive comment :) ... – Abdulaziz aldabbagh Aug 13 '20 at 10:09
  • @Abdulazizaldabbagh yes, it hadn't escaped my notice that you're new to the site. I promise you, in years to come you'll look back at this question and understand why it attracted only comments and no answer thus far; it isn't answerable in its current form, because questions of the form "here is my broken solution, and here is my broken output, how do I fix it?" generally aren't - there is nowhere near enough detail. We can't look at an output where you've thrown an indeterminate amount of input detail away and proffer a solution that correctly generates the required output. – Caius Jard Aug 13 '20 at 12:18
  • Please spend some time on a site like https://dbfiddle.uk/ assembling a fiddle that a) creates all the same tables as your question has, b) has a representative sample of the actual input data your query is working on, inserted into those tables, and c) has this query producing the exact "wrong" output you showed in your question - and *then* we'll be able to tell you why your query doesn't work and what to do to fix it. Right now, other than "there must be some input data we can't see", no one in the world can tell why your desired str1qty column contains a 2 when the input is all 0. – Caius Jard Aug 13 '20 at 12:21
  • I've done as much as I can so far in terms of advice; your joins are faulty - they connect one row on the left to too many rows on the right. This is a cartesian product error, and you can read about it at length in various places on the web but in essence if you have 1 person and he has 4 addresses, 3 of which are marked as deleted and you forgot to say `and isdeleted = 0` then you'll get 4 rows out. If you've only done "select person.name" then you'll be looking at "john john john john", and the answer is NOT to smash a DISTINCT in there "to dedupe it", it's to *fix the broken join* – Caius Jard Aug 13 '20 at 12:26
  • You now need to post more detail, or remove that distinct, and that group by and put a SELECT * and look at what ALL the data coming out of your joins is, and WHY when you joined in rows from lot and lotqty you got more rows than you wanted. 99% you've forgotten something, 1% the table erroneously contains duplicated data that needs removing – Caius Jard Aug 13 '20 at 12:28
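Pulling the comments' advice together (aggregate instead of DISTINCT, collapse the per-store rows with SUM over the conditional columns), a hedged sketch of the rewrite they describe might look like the following. This is an illustration only: the table and column names are taken from the question, the column list is abbreviated, and whether the LOT_QTY join also needs an extra condition (e.g. on a lot-number column) to kill the cartesian product cannot be determined from the question - if the join fans out, that must be fixed first.

```sql
-- Sketch: drop DISTINCT, remove STORE_NO/QTY from GROUP BY,
-- and SUM each DECODE so all store rows for one ITEM_SID
-- collapse into a single output row.
SELECT i.SBS_NO,
       TO_CHAR(i.ITEM_SID) AS ITEMSID,
       i.ALU,
       a.LOT_NUMBER,
       SUM(DECODE(s.STORE_NO, 1, s.QTY, 0))  AS STR1QTY,
       SUM(DECODE(s.STORE_NO, 2, s.QTY, 0))  AS STR2QTY,
       -- ... stores 3-7 and 0 follow the same pattern ...
       SUM(DECODE(s.STORE_NO, 99, s.QTY, 0)) AS WHQTY
FROM inventory_v i
INNER JOIN LOT a
        ON i.ITEM_SID = a.ITEM_SID
       AND i.SBS_NO   = a.SBS_NO
INNER JOIN LOT_QTY s
        ON a.ITEM_SID = s.ITEM_SID
       AND a.SBS_NO   = s.SBS_NO
WHERE i.SBS_NO = 1
  AND s.STORE_NO IN (0, 1, 2, 3, 4, 5, 6, 7, 99)
  AND a.ACTIVE = 1
  AND i.ACTIVE = 1
GROUP BY i.SBS_NO,
         i.ITEM_SID,
         i.ALU,
         a.LOT_NUMBER;
```

Oracle's PIVOT clause (linked in the comments) can express the same store-number-to-column transposition, but conditional aggregation as above is the more direct fix for this query's shape.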

0 Answers