Use of Extract datasets
Hello,
Having seen various instances of the EXTRACT command in some legacy code, I've finally got around to trying it out, but I can't get it to work properly. Here's some condensed test code I've put together:
TABLES: VBAK, VBAP.
FIELD-GROUPS: HEADER, TOP, DETAIL.
INSERT: VBAK-VBELN VBAP-POSNR INTO HEADER,
VBAK-ERDAT VBAK-ERNAM VBAK-KUNNR INTO TOP,
VBAP-MATNR VBAP-CHARG VBAP-NETWR VBAP-WAERK INTO DETAIL.
PARAMETERS P_VKORG TYPE VBAK-VKORG.
SELECT * FROM VBAK
  WHERE VKORG = P_VKORG.
  EXTRACT TOP.
  SELECT * FROM VBAP
    WHERE VBELN = VBAK-VBELN.
    EXTRACT DETAIL.
  ENDSELECT.
ENDSELECT.
CLEAR: VBAK, VBAP.
SORT.
LOOP.
WRITE:/ VBAK-VBELN, VBAP-POSNR, VBAK-ERDAT, VBAK-ERNAM, VBAK-KUNNR, VBAP-MATNR.
ENDLOOP.
OK, forget about the nested SELECTs and other bad practices; how come I get the following output, and how should it be corrected?
1360043260 000000 06.05.2008 CARTEM2 304393 <<< extraneous line that doesn't exist
1360043260 000010 06.05.2008 CARTEM2 304393 34123376
1360043260 000020 06.05.2008 CARTEM2 304393 50492316
1360043260 000030 06.05.2008 CARTEM2 304393 50406766
Use of extracts seems a bit weird, and there's not a lot out there that properly explains how to use them (apart from examples using a logical database).
Thanks,
Chris.
Hi,
When the first EXTRACT statement occurs in a program, the system creates the extract dataset and adds the first extract record to it. In each subsequent EXTRACT statement, the new extract record is added to the dataset.
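As for the stray first line in the output above: it is the TOP record itself. The LOOP processes every extract record, and when EXTRACT TOP ran, the VBAP fields in HEADER were still initial (hence POSNR 000000 and a blank MATNR). A minimal sketch, reusing the declarations from the question, that distinguishes the record types with AT ... ENDAT so that only item records produce an output line:

```abap
SORT.
LOOP.
  AT TOP.
*   Header-level record: the VBAK fields restored here stay
*   available while the following DETAIL records are processed.
  ENDAT.
  AT DETAIL.
*   Only item-level records are written, so the TOP-only line
*   with POSNR 000000 no longer appears.
    WRITE:/ VBAK-VBELN, VBAP-POSNR, VBAK-ERDAT, VBAK-ERNAM,
            VBAK-KUNNR, VBAP-MATNR.
  ENDAT.
ENDLOOP.
```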
Instead of EXTRACT, you can modify the above code in one of two ways:
1) Keep the SELECT ... ENDSELECT, comment out the EXTRACT statement, fetch each row into a work area and, once fetched, APPEND it to an internal table.
2) Use SELECT ... INTO TABLE, whereby you fetch all the data into an internal table in one go and then use it for processing.
For better performance, use the second method. Below is how you could change it:
DATA: LT_VBAK TYPE STANDARD TABLE OF VBAK,
      LT_VBAP TYPE STANDARD TABLE OF VBAP,
      LS_VBAK TYPE VBAK,
      LS_VBAP TYPE VBAP.

SELECT *
  FROM VBAK
  INTO TABLE LT_VBAK
  WHERE VKORG = P_VKORG.

IF LT_VBAK IS NOT INITIAL.  " guard FOR ALL ENTRIES against an empty table
  SELECT *
    FROM VBAP
    INTO TABLE LT_VBAP
    FOR ALL ENTRIES IN LT_VBAK
    WHERE VBELN = LT_VBAK-VBELN.
ENDIF.

LOOP AT LT_VBAP INTO LS_VBAP.
  READ TABLE LT_VBAK INTO LS_VBAK WITH KEY VBELN = LS_VBAP-VBELN.
  IF SY-SUBRC EQ 0.
    WRITE:/ LS_VBAK-VBELN, LS_VBAP-POSNR, LS_VBAK-ERDAT, LS_VBAK-ERNAM,
            LS_VBAK-KUNNR, LS_VBAP-MATNR.
  ENDIF.
ENDLOOP.
Hope this helps you
Regards
Shiva
Similar Messages
-
Hello dear ABAP aces,
Please let me know the differences between extract datasets, internal tables, and field groups. What are the similarities? Also, what are the uses of extract datasets and field groups?
Thanks in advance.
Regards,
Farooq

Hi,
There are two ways of processing large quantities of data in ABAP - either using internal tables or extract datasets.
An internal table is a dynamic sequential dataset in which all records have the same structure and a key. They are part of the ABAP type concept. You can access individual records in an internal table using either the index or the key.
Extracts are dynamic sequential datasets in which different lines can have different structures. Each ABAP program may currently only have a single extract dataset. You cannot access the individual records in an extract using key or index. Instead, you always process them using a loop.
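A short sketch of the two access models, using the SAP demo table SFLIGHT (the concrete table and field names are for illustration only):

```abap
REPORT zdemo_extract_vs_itab.
TABLES sflight.

* Internal table: every row has the same structure; records can be
* read directly by index or by key.
DATA: lt_flights TYPE STANDARD TABLE OF sflight,
      ls_flight  TYPE sflight.
SELECT * FROM sflight INTO TABLE lt_flights UP TO 10 ROWS.
READ TABLE lt_flights INTO ls_flight INDEX 3.                " by index
READ TABLE lt_flights INTO ls_flight WITH KEY carrid = 'LH'. " by key

* Extract: records of different structure in one dataset; there is no
* index or key access, only sequential processing in a LOOP.
FIELD-GROUPS: header, detail.
INSERT sflight-carrid INTO header.
INSERT sflight-fldate sflight-price INTO detail.
SELECT * FROM sflight UP TO 10 ROWS.
  EXTRACT detail.
ENDSELECT.
SORT.
LOOP.   " the only way to read the extract back
  WRITE: / sflight-carrid, sflight-fldate, sflight-price.
ENDLOOP.
```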
Check these links :
http://help.sap.com/saphelp_nw2004s/helpdata/en/9f/db9ede35c111d1829f0000e829fbfe/frameset.htm
http://www.geocities.com/SiliconValley/Grid/4858/sap/ABAPCode/Fieldgroups.htm
http://help.sap.com/saphelp_nw04/helpdata/en/fc/eb3ca6358411d1829f0000e829fbfe/frameset.htm
Regards
L Appana -
Hi
I have written the following sample code using an extract dataset, but I am not getting my expected output. Kindly correct me where I am wrong.
report z_test_extract.
data: begin of wa1,
eno type i,
ename(20) type c,
sal type i,
end of wa1,
begin of wa2,
sno type i,
sname(20) type c,
end of wa2.
field-groups: fs1,
fs2,
header.
insert wa1 into fs1.
insert wa2 into fs2.
insert wa1-eno wa1-ename wa1-sal into header.
wa1-eno = 1000.wa1-ename = 'abdul hakim'.wa1-sal = 10000.extract fs1.
wa1-eno = 2000.wa1-ename = 'abdul aleem'.wa1-sal = 20000.extract fs1.
wa1-eno = 3000.wa1-ename = 'sayyed'.wa1-sal = 30000.extract fs1.
wa1-eno = 4000.wa1-ename = 'abdul'.wa1-sal = 40000.extract fs1.
wa2-sno = 1000.wa2-sname = 'Mr.xyz'.extract fs2.
wa2-sno = 2000.wa2-sname = 'Mr.zyx'.extract fs2.
sort by wa1-sal.
loop.
at end of wa1-sal.
write:/ 'Sum:',Sum(wa1-sal).
endat.
at last.
write:/ 'Gross:',sum(wa1-sal).
endat.
endloop.
Current Output:
Sum:10000
Sum:20000
Sum:30000
Sum:120000
Gross:180000.
Expected Output:
Sum:10000
Sum:20000
Sum:30000
Sum:40000
Gross:100000.
Thanks,
Abdul Hakim

Hi,
The HEADER field group contains wa1-eno, wa1-ename and wa1-sal. When the records for FS2 are extracted, the HEADER part of those records is filled with whatever is currently in wa1, i.e. the last FS1 values (wa1-eno = 4000, wa1-ename = 'abdul', wa1-sal = 40000).
After SORT, those two FS2 records fall into the wa1-sal = 40000 group, so that group is summed three times (40000 * 3 = 120000), which is why the gross comes to 180000.
One way out is to fill the HEADER fields appropriately while extracting FS2. Alternatively, extract the FS2 records before any FS1 record, while wa1 is still initial.
I have modified the code as shown below:
I have modified the code as shown below
data: begin of wa1,
eno type i,
ename(20) type c,
sal type i,
end of wa1,
begin of wa2,
sno type i,
sname(20) type c,
end of wa2.
field-groups: fs1,
fs2,
header.
insert wa1 into fs1.
insert wa2 into fs2.
insert wa1-eno wa1-ename wa1-sal into header.
*Fill up before filling up FS1
wa2-sno = 1000.wa2-sname = 'Mr.xyz'.extract fs2.
wa2-sno = 2000.wa2-sname = 'Mr.zyx'.extract fs2.
wa1-eno = 1000.wa1-ename = 'abdul hakim'.wa1-sal = 10000.extract fs1.
wa1-eno = 2000.wa1-ename = 'abdul aleem'.wa1-sal = 20000.extract fs1.
wa1-eno = 3000.wa1-ename = 'sayyed'.wa1-sal = 30000.extract fs1.
wa1-eno = 4000.wa1-ename = 'abdul'.wa1-sal = 40000.extract fs1.
sort by wa1-sal.
loop.
at end of wa1-sal.
write:/ 'Sum:',Sum(wa1-sal).
endat.
at last.
write:/ 'Gross:',sum(wa1-sal).
endat.
endloop.
Regards,
M.Saravanan -
Spry Menu Using Nested XML Dataset (Spry 1.6)
I have a vertical menu with a few items, one of which is labeled Products and has submenus. I want that submenu to be read from a Nested XML dataset. Using a single dataset for one level in a menu is easy enough, but the submenu will itself have submenus.
Example Menu:
Home
Company
Products
|-- Product 1
|-- Item 1
|-- Item 2
|-- Item 3
|-- Product 2
|-- Item 1
|-- Item 2
|-- Product 3
|-- Item 1
|-- Item 2
|-- Item 3
|-- Item 4
|-- Product 4
|-- Item 1
|-- Item 2
|-- Product 5
I have been looking for an easy way to use the Spry Nested
XML Dataset to create the Product/Item menu. The number of Products
may vary as well as the number of Items in each Product submenu
(also, some Products may not have Items).
I already have an ASP page that creates the XML data from a
database.
Schema follows (XSD ):
<?xml version="1.0" encoding="utf-8"?>
<xsd:schema xmlns:xsd="
http://www.w3.org/2001/XMLSchema">
<xsd:element name="products">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="product_type"
maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="type_name" maxOccurs="1"
type="xsd:string"/>
<xsd:element name="type_url" maxOccurs="1"
type="xsd:anyURI"/>
<xsd:element name="product_name"
maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="item_name" maxOccurs="1"
type="xsd:string"/>
<xsd:element name="item_url" maxOccurs="1"
type="xsd:anyURI"/>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:schema>
I have been programming for 17 years but am new to Spry. If this cannot be done easily with the Spry framework, I'll probably wind up splitting the XML data into two files (Products and Items), writing a nested loop, and calling each by row; but then I have to work out how Spry datasets reference XML data. Figuring out how to call rows from the XML data shouldn't be so bad, but this method just seems like a hassle for something that should be easy.
Pseudocode follows:
j = 1
For i = 1 to TotalNumberProducts
    display Product i from Products
    ItemsExist = true
    While ItemsExist
        if j > TotalNumberItems or Item j is not for Product i then
            ItemsExist = false
        else
            display Item j from Items
            j = j + 1
    Wend
Next
Thanks in advance for any help or direction!

That's exactly what I'm trying to do. However, I implemented that code and the submenus won't appear. I suspect the submenus aren't finding the field names from the Nested XML Dataset. The first level of Product menus works great (this is a submenu of the overall menu) and correctly identifies products that do not have submenus, so I know it's picking up the number of records in the Nested Dataset correctly; it just won't display the data in the next level of menu.
variable and script declarations:
<script src="SpryAssets/SpryMenuBar.js"
type="text/javascript"></script>
<script src="SpryAssets/xpath.js"
type="text/javascript"></script>
<script src="SpryAssets/SpryData.js"
type="text/javascript"></script>
<script src="SpryAssets/SpryNestedXMLDataSet.js"
type="text/javascript"></script>
<link href="SpryAssets/SpryMenuBarVertical.css"
rel="stylesheet" type="text/css">
<script type="text/javascript">
<!--
var productMenuData = new
Spry.Data.XMLDataSet("products.asp", "products/product_type");
var productMenuDataItems = new
Spry.Data.NestedXMLDataSet(productMenuData, "product_name");
//-->
</script>
Code for menus:
<ul id="NavMenu" class="MenuBarVertical">
<li><a
href="index.html">Home</a></li>
<li><a
href="company.html">Company</a></li>
<li><a href="franco_giberti.html">Franco
Giberti</a></li>
<li><a class="MenuBarItemSubmenu"
href="products.asp">Products</a>
<ul spry:region="productMenuData
productMenuDataItems">
<li spry:repeat="productMenuData"><a
class="MenuBarItemSubmenu" href="{type_url}"
spry:if="{productMenuDataItems::ds_RowCount} !=
0">{type_name}</a> <a href="{type_url}"
spry:if="{productMenuDataItems::ds_RowCount} ==
0">{type_name}</a>
<ul spry:if="{productMenuDataItems::ds_RowCount} !=
0">
<li spry:repeat="productMenuDataItems"><a
href="{productMenuDataItems::item_url}">{productMenuDataItems::item_name}</a></li>
</ul>
</li>
</ul>
</li>
<li><a href="contact.html">Contact Us</a>
<!-- end #sidebar1 -->
</li>
</ul>
XML:
<products
xsi:noNameSpaceSchemaLocation="products.xsd">
<product_type>
<type_name>Pasta Sauce</type_name>
<type_url>pt_2.asp</type_url>
<product_name>
<item_name>Putenesca</item_name>
<item_url>pn_3.asp</item_url>
</product_name>
<product_name>
<item_name>Arrabiata</item_name>
<item_url>pn_4.asp</item_url>
</product_name>
<product_name>
<item_name>Pesto</item_name>
<item_url>pn_5.asp</item_url>
</product_name>
<product_name>
<item_name>Basil and Tomato</item_name>
<item_url>pn_6.asp</item_url>
</product_name>
<product_name>
<item_name>Bolognese</item_name>
<item_url>pn_7.asp</item_url>
</product_name>
<product_name>
<item_name>Carboniera</item_name>
<item_url>pn_8.asp</item_url>
</product_name>
</product_type>
<product_type>
<type_name>Organic Olive Oil</type_name>
<type_url>pt_3.asp</type_url>
<product_name>
<item_name>Original</item_name>
<item_url>pn_9.asp</item_url>
</product_name>
<product_name>
<item_name>Basil</item_name>
<item_url>pn_10.asp</item_url>
</product_name>
<product_name>
<item_name>Herbs</item_name>
<item_url>pn_11.asp</item_url>
</product_name>
<product_name>
<item_name>Sun Dried Tomato</item_name>
<item_url>pn_12.asp</item_url>
</product_name>
</product_type>
<product_type>
<type_name>Organic Spreads</type_name>
<type_url>pt_4.asp</type_url>
<product_name>
<item_name>Putenesca</item_name>
<item_url>pn_13.asp</item_url>
</product_name>
<product_name>
<item_name>Arrabiata</item_name>
<item_url>pn_14.asp</item_url>
</product_name>
<product_name>
<item_name>Pesto</item_name>
<item_url>pn_15.asp</item_url>
</product_name>
<product_name>
<item_name>Basil and Tomato</item_name>
<item_url>pn_16.asp</item_url>
</product_name>
<product_name>
<item_name>Bolognese</item_name>
<item_url>pn_17.asp</item_url>
</product_name>
<product_name>
<item_name>Carboniera</item_name>
<item_url>pn_18.asp</item_url>
</product_name>
</product_type>
<product_type>
<type_name>Organic Grilled Vegetables</type_name>
<type_url>pt_5.asp</type_url>
<product_name>
<item_name>Putenesca</item_name>
<item_url>pn_19.asp</item_url>
</product_name>
<product_name>
<item_name>Arrabiata</item_name>
<item_url>pn_20.asp</item_url>
</product_name>
<product_name>
<item_name>Pesto</item_name>
<item_url>pn_21.asp</item_url>
</product_name>
<product_name>
<item_name>Basil and Tomato</item_name>
<item_url>pn_22.asp</item_url>
</product_name>
<product_name>
<item_name>Bolognese</item_name>
<item_url>pn_23.asp</item_url>
</product_name>
<product_name>
<item_name>Carboniera</item_name>
<item_url>pn_24.asp</item_url>
</product_name>
</product_type>
<product_type>
<type_name>Truffle Products</type_name>
<type_url>pt_6.asp</type_url>
</product_type>
</products>
Any further guidance would be very much appreciated! -
Error while using query in Dataset to retrieve unique data
Hi,
I have added the query below to the EBS dataset to retrieve only unique applications, as the APPLICATION_NAME column has some duplicate data. I imported the data to the DB.
select distinct(APPLICATION_NAME) as NAME from APPLICATION
But, when I try to filter on application I get below error
Internal Exception: java.sql.SQLSyntaxErrorException: ORA-00936: missing expression
Error Code: 936
Call: select distinct(APPLICATION_NAME) as NAME from APPLICATION WHERE UPPER ( DISTINCT(APPLICATION_NAME) ) like UPPER('%%')
Query: DataReadQuery(sql="select distinct(APPLICATION_NAME) as NAME from APPLICATION WHERE UPPER ( DISTINCT(APPLICATION_NAME) ) like UPPER('%%') ").
[2012-01-06T12:23:19.054+00:00] [WLS_OIM1] [WARNING] [] [oracle.adfinternal.view.faces.lifecycle.LifecycleImpl] [tid: [ACTIVE].ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: xelsysadm] [ecid: 7ca16c37caf00ffe:39d560f4:134b0a19524:-8000-0000000000003f01,0] [APP: oim#11.1.1.3.0] ADF_FACES-60098:Faces lifecycle receives unhandled exceptions in phase INVOKE_APPLICATION 5[[
oracle.iam.platform.canonic.base.NoteException: An error occurred while executing the lookup query.
at oracle.iam.platform.canonic.agentry.GenericEntityLookupActor.perform(GenericEntityLookupActor.java:337)
at oracle.iam.consoles.faces.render.canonic.UIValue$UIEntitySelector.search(UIValue.java:1736)
at oracle.iam.consoles.faces.render.canonic.UIValue$UIEntitySelector.access$2400(UIValue.java:1467)
at oracle.iam.consoles.faces.render.canonic.UIValue$EntitySelectorQueryListener.processQuery(UIValue.java:1787)
at oracle.adf.view.rich.event.QueryEvent.processListener(QueryEvent.java:67)
at org.apache.myfaces.trinidad.component.UIXComponentBase.broadcast(UIXComponentBase.java:675)
at oracle.adf.view.rich.component.UIXQuery.broadcast(UIXQuery.java:108)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.broadcastEvents(LifecycleImpl.java:902)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:313)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:186)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:265)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.help.web.rich.OHWFilter.doFilter(Unknown Source)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:205)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:106)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:447)
at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:447)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:271)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:177)
at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.security.wls.filter.SSOSessionSynchronizationFilter.doFilter(SSOSessionSynchronizationFilter.java:277)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.iam.platform.auth.web.PwdMgmtNavigationFilter.doFilter(PwdMgmtNavigationFilter.java:122)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.iam.platform.auth.web.OIMAuthContextFilter.doFilter(OIMAuthContextFilter.java:108)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.adf.library.webapp.LibraryFilter.doFilter(LibraryFilter.java:176)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:413)
at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:161)
at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:136)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
[2012-01-06T12:23:19.055+00:00] [WLS_OIM1] [ERROR] [] [oracle.adfinternal.view.faces.config.rich.RegistrationConfigurator] [tid: [ACTIVE].ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: xelsysadm] [ecid: 7ca16c37caf00ffe:39d560f4:134b0a19524:-8000-0000000000003f01,0] [APP: oim#11.1.1.3.0] ADF_FACES-60096:Server Exception during PPR, #3[[
oracle.iam.platform.canonic.base.NoteException: An error occurred while executing the lookup query.
(stack trace identical to the one above)
Can anyone please tell me how to get unique/distinct applications using a query in the dataset?
Thanks in advance.

887188 wrote:
Hi,
I have added the below query in the EBS dataset to retrieve only unique applications as APPLICATION_NAME column has some duplicate data. I imported data to DB.
select distinct(APPLICATION_NAME) as NAME from APPLICATION

Is there any table in the OIM schema which you have named APPLICATION? AFAIK, applications from EBS are imported into Lookups in OIM, and you will have to query the lookup to get the application name in OIM.
But, when I try to filter on application I get below error
Internal Exception: java.sql.SQLSyntaxErrorException: ORA-00936: missing expression
Error Code: 936
Call: select distinct(APPLICATION_NAME) as NAME from APPLICATION WHERE UPPER ( DISTINCT(APPLICATION_NAME) ) like UPPER('%%')

The UPPER ( DISTINCT(APPLICATION_NAME) ) like UPPER('%%') clause here is wrong. What are you trying to achieve through this query?
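The composed statement in the error log shows what goes wrong: the lookup framework appends its own WHERE UPPER ( ... ) like UPPER('%%') filter around the expression in your select list, so the DISTINCT keyword ends up inside UPPER(), which is invalid SQL. Assuming the framework only wraps the outer select list (an assumption based on the generated SQL in the log, not on documentation), one way around it is to push the DISTINCT into an inline view so that the outer column is a plain identifier:

```sql
-- NAME is a plain column in the outer query, so a generated filter
-- such as UPPER(NAME) LIKE UPPER('%%') stays syntactically valid.
SELECT NAME
FROM (SELECT DISTINCT APPLICATION_NAME AS NAME
        FROM APPLICATION)
```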
-Bikash -
Which extraction methods are used for extracting AR, AP, GL, cost centre
Which extraction methods are used for extracting AR, AP, GL and cost centre account data from an R/3 system?
Please let me know what type of extraction we use: generic, FI-SL, or CO-PA, and in what scenarios we use these extractions.
If anyone has documents on it, please send them by email to [email protected].
I am a bit confused with the SAP help.
Will reward full points. Please reply.

Hi,
For general ledger :
http://help.sap.com/saphelp_nw70/helpdata/en/57/dd153c4eb5d82ce10000000a114084/frameset.htm
This is the best how-to guide on AP,AR,GL and TAX.
http://help.sap.com/saphelp_nw04/helpdata/en/af/16533bbb15b762e10000000a114084/frameset.htm
Hope it helps.
Regards,
Srikanth. -
Whenever I measure the amplitude of a signal using the Extract Single Tone VI, the amplitude is smaller than if I measure the signal by hand by positioning the top and bottom cursors at the highest and lowest peaks of the captured waveform. Should I not use the Extract Single Tone VI to measure amplitude? I'm using LabVIEW 6i.
I'm not exactly sure, but I think the amplitude from the Extract Single Tone VI is in peak volts (Vp). When you set the cursors on a scope to the top and bottom peaks, you are measuring volts peak-to-peak (Vp-p). If the Extract VI amplitude is one half of your scope reading, then this is the case; just double the Extract VI amplitude to get the peak-to-peak voltage. It also depends on the signal: a square wave from 0 to 5 volts (TTL) is usually measured in peak voltage, while a sine wave going positive and negative (from -5 to +5) is usually measured peak-to-peak.
- tbob
Inventor of the WORM Global -
Using XML extraction from Oracle and XSLT data transformation
Hi
How can I transfer data, i.e. use XML extraction from Oracle and XSLT data transformation, with a Java application?
Usually I query SQL, get the data from a table, assign it to a model class, then send it to the UI. How can I go about XML extraction from Oracle?
Thanks.

Sorry, I don't understand what exactly you want to do, and I'm under the impression that you might not know exactly what you want to do either. Could you explain in a bit more detail what you want to achieve?
-
Using IMAQ Extract on a Calibrated Image
Hello,
I'm a student new to LabVIEW, working on a project to get a web cam to identify and analyze the state of a checkers game board. My VI calibrates the image of the board to correct for distortion and to fit it to its real-world coordinates. At this point I want to extract the game board from its surroundings to make the rest of my analysis easier. However, if I do this, I end up voiding my calibration. Is there a way I can make a new image based on my calibrated image so I can use the Extract VI, or do I need a new approach?
Thanks for your time!

It sounds like you're not passing the calibrated image to the Extract VI correctly. The Extract VI requires a new memory allocation to be made and then copies the source image to that location. It then creates a new image at the new memory location with only the extracted portion of the original image. If you don't handle your memory locations correctly, you may be overwriting your original image instead of extracting from it. Make sure you are allocating a new memory location by calling a new IMAQ Create VI (using a different string input) and feeding that into the Image Dst input of the IMAQ Extract VI. This should ensure that you preserve the calibrated image before running the extraction.
Regards,
Chris L
Applications Engineer
National Instruments
Certified LabVIEW Associate Developer -
Which tables are used to extract the data for the material extractors
Hi
Please advise which tables are used to extract the data for the extractors:
0MAT_SALES_ATTR
0MATERIAL_ATTR
0MAT_PLANT_ATTR
0PLANT_ATTR
Many thanks.

Hi,
Activate a database trace using ST01, then perform the MM03 operation and go to the SD Texts tab. Now stop the trace and see if you can locate in the log either a table or a program which you can utilize.
http://help.sap.com/saphelp_47x200/helpdata/en/dd/55f993545a11d1a7020000e829fd11/frameset.htm
Hope this helps..
Rgs,
Ravikanth. -
Hi,
Could someone give sample code for creating an extract dataset?
Thanks.

Hi,
Here is a program that lists vendors and their accounting documents. It creates an extract dataset from the KDF logical database and loops through the dataset to build the required report. Vendors that have no documents are not listed.
See the code:
report zfwr0001 no standard page heading.
tables: lfa1, bsik.
field-groups: header, item1, item2.
insert lfa1-lifnr bsik-belnr into header.
insert lfa1-land1 lfa1-name1 lfa1-ort01 into item1.
insert bsik-belnr bsik-budat into item2.
start-of-selection.
get lfa1.
  extract item1.
get bsik.
  extract item2.
end-of-selection.
loop.
  at item1 with item2.
    skip.
    write:/ 'Vendor number:', 28 'Name:', 56 'City:'.
    write: 16 lfa1-lifnr, 33(20) lfa1-name1, 62(20) lfa1-ort01.
    write:/ 'Document no.', 15 'Date'.
  endat.
  at item2.
    write:/ bsik-belnr, 13 bsik-budat.
  endat.
endloop.
Cheers,
Chandra Sekhar. -
Setting the Java path after using the self-extracting binary - please help
I have installed Java on Linux. I used the self-extracting binary because I wanted to install Java in a specific folder: user -> local -> apps.
Now my doubt is: how can I set the path?
Previously I installed Java using RPM and then edited the .bashrc file. Can I follow the same steps in this case as well?
I am not a root user, but I can get all the privileges by using the sudo command.

You can set it using ~/.bashrc or ~/.bash_profile. This will apply to you only, not to the other users on the system.
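A sketch of the lines you might add to ~/.bashrc; the JDK directory name below is an assumption, so substitute the folder the self-extracting binary actually created:

```shell
# Point JAVA_HOME at the unpacked JDK (hypothetical path - adjust it).
JAVA_HOME="$HOME/local/apps/jdk1.5.0_22"
export JAVA_HOME

# Put the JDK's bin directory at the front of the search path.
PATH="$JAVA_HOME/bin:$PATH"
export PATH
```

After editing, run `source ~/.bashrc` (or log in again) and check the result with `which java`.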
-
Identify ABAP reports where the OPEN DATASET instruction is used
We need to identify the ABAP programs in which the OPEN DATASET instruction is used. We need this because we are planning a platform change from Windows to Unix.
Regards
Giuseppe

Hi,
I am not sure, but you can check it either manually or by keyword search. Alternatively, use the Code Inspector and enter the OPEN DATASET keyword; that will find all occurrences.
If you get some more information, then please also share it with me.
Regds,
Rakesh -
Problem extracting data from a Z function module for a Z DataSource
Hi,
I have created a Z function module for extracting data for a Z DataSource, and I have written code for it. But every time it returns:
0 datasets extracted...
I debugged it and found data in E_T_DTFIAR_1, and up to that point it is working fine (as far as I can tell). After that it goes into some other forms (like the data transfer) and some events, and finally it reports that 0 data records were found.
Here is the code. Can anyone tell me what I am missing?
FUNCTION ZFIE*.
*"  Local interface:
*" IMPORTING
*" VALUE(I_DSOURCE) TYPE SBIWA_S_INTERFACE-ISOURCE
*" VALUE(I_REQUNR) TYPE SBIWA_S_INTERFACE-REQUNR OPTIONAL
*" VALUE(I_MAXSIZE) TYPE SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
*" VALUE(I_INITFLAG) TYPE SBIWA_S_INTERFACE-INITFLAG OPTIONAL
*" VALUE(I_UPDMODE) TYPE SBIWA_S_INTERFACE-UPDMODE OPTIONAL
*" VALUE(I_DATAPAKID) TYPE SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
*" TABLES
*" I_T_SELECT TYPE SBIWA_T_SELECT OPTIONAL
*" I_T_FIELDS TYPE SBIWA_T_FIELDS OPTIONAL
*" E_T_DTFIAR_1 STRUCTURE ZFIE_ (extractor structure) OPTIONAL
*" EXCEPTIONS
*" NO_MORE_DATA
*" ERROR_PASSED_TO_MESS_HANDLER
TABLES: VBREVE, ZFIE_EXT_REV, VBAK, VBRP, VBPA, KNA1.
*....... steering flags
STATICS: L_CURSOR TYPE CURSOR,
L_OPEN_CURSOR_FLAG LIKE C_OFF,
L_LAST_DATA_FLAG LIKE C_OFF.
*....... PACKAGE-SIZE for SELECT-Statement
STATICS: L_PACKAGE_SIZE LIKE SY-TABIX.
* first call - initialization **********************
IF NOT ( I_INITFLAG IS INITIAL ).
IF NOT ( G_FLAG_INTERFACE_INITIALIZED IS INITIAL ).
**.... Invalid second initialization call -> error exit
IF 1 = 2. MESSAGE E008(R3). ENDIF. "only for Where-used list
LOG_WRITE 'E' "message type
'R3' "message class
'008' "message number
' ' "message variable 1
' '. "message variable 2
RAISE ERROR_PASSED_TO_MESS_HANDLER.
ENDIF.
*.. check DataSource validity
CASE I_DSOURCE.
WHEN C_ISOURCE_DTFIAR_1.
WHEN OTHERS.
IF 1 = 2.
MESSAGE E009(R3) WITH I_DSOURCE. "only for Where-used list
ENDIF.
LOG_WRITE 'E' "message type
'R3' "message class
'009' "message number
I_DSOURCE "message variable 1
' '. "message variable 2
RAISE ERROR_PASSED_TO_MESS_HANDLER.
ENDCASE.
*.. Check for supported update mode
CASE I_UPDMODE.
WHEN 'F'.
WHEN OTHERS.
IF 1 = 2.
MESSAGE E011(R3) WITH I_UPDMODE. "only for Where-used list
ENDIF.
LOG_WRITE 'E' "message type
'R3' "message class
'011' "message number
I_UPDMODE "message variable 1
' '. "message variable 2
RAISE ERROR_PASSED_TO_MESS_HANDLER.
ENDCASE.
APPEND LINES OF I_T_SELECT TO G_T_SELECT.
*.. Fill parameter buffer for data extraction calls
G_S_INTERFACE-REQUNR = I_REQUNR.
G_S_INTERFACE-ISOURCE = I_DSOURCE.
G_S_INTERFACE-MAXSIZE = I_MAXSIZE.
G_S_INTERFACE-INITFLAG = I_INITFLAG.
G_S_INTERFACE-UPDMODE = I_UPDMODE.
G_S_INTERFACE-DATAPAKID = I_DATAPAKID.
G_FLAG_INTERFACE_INITIALIZED = SBIWA_C_FLAG_ON.
* Fill field list table for an optimized select statement
* (in case that there is no 1:1 relation between InfoSource fields
* and database table fields this may be far from being trivial)
APPEND LINES OF I_T_FIELDS TO G_T_FIELDS.
*.. calculate PACKAGE-SIZE for SELECT-Statement (approximate)
L_PACKAGE_SIZE = G_S_INTERFACE-MAXSIZE / 12.
*.. fill selection criteria into global ranges
* second and further calls - data selection *************
ELSE.
*.. clear table for export data
CLEAR: E_T_DTFIAR_1.
REFRESH: E_T_DTFIAR_1.
IF L_OPEN_CURSOR_FLAG IS INITIAL.
*.... open cursor
OPEN CURSOR WITH HOLD L_CURSOR FOR
SELECT
REFERENCE_DOC REF_DOC_ITM GL_ACCOUNT YEAR_PERIOD AMOUNT_DOC_CURR
CURRENCY PROFIT_CENTER GL_OFFSET BUSINESS_AREA COMPANY_CODE
REVENUE_STATUS SOLD_TO FISCAL_YEAR PERIOD DOCUMENT_TYPE
AMOUNT_COMPANY AMOUNT_PROFT_CTR BILLING_DATE PLANT SALES_OFFICE
SALES_GROUP SALES_DISTRICT END_USER_COUNTRY PROD_HIERARCHY
MATERIAL MATERIAL_COST SALES_ORG DISTR_CHANNEL NUMBER_OF_NODES
ORDER_DATE MATERIAL_GROUP AAG CONTRACT QUANTITY DAF_NUMBER
BILLING_DATE
FROM ZFIE_EXT_REV
WHERE
RECORD_TYPE ='DEF'.
SELECT SINGLE VBELN FROM VBREVE INTO ITAB_REVENUE-REFDOCNR.
L_OPEN_CURSOR_FLAG = C_ON.
ENDIF. "L_OPEN_CURSOR_FLAG = C_OFF
IF L_PACKAGE_SIZE <> 0.
*.... fetch next package
FETCH NEXT CURSOR L_CURSOR
APPENDING CORRESPONDING FIELDS OF TABLE LT_DEF_REV
PACKAGE SIZE L_PACKAGE_SIZE.
*.... process selected data
PERFORM PROCESS_SEL_DATA_AR1 TABLES LT_DEF_REV E_T_DTFIAR_1.
*.... check, if cursor has to be closed
DESCRIBE TABLE LT_DEF_REV LINES SY-TFILL.
IF SY-TFILL LT L_PACKAGE_SIZE.
CLOSE CURSOR L_CURSOR.
L_LAST_DATA_FLAG = 'X'.
ENDIF.
ENDIF.
ENDIF.
ENDFUNCTION.
* FORM PROCESS_SEL_DATA_AR1 *
FORM PROCESS_SEL_DATA_AR1 TABLES SEL_DATA STRUCTURE LT_DEF_REV
EXP_DATA STRUCTURE ZFIE_BIW_DEF_HIS.
*....... local data declarations
if not SEL_DATA[] is initial.
move sel_data[] to exp_data[].
SELECT POPUPO VBELN_N POSNR_N RVAMT ACCPD PAOBJNR SAKUR SAMMG
REFFLD ERDAT ERZET BUDAT REVFIX
APPENDING CORRESPONDING FIELDS OF TABLE it_vbreve
FROM VBREVE
WHERE VBELN = ITAB_REVENUE-REFDOCNR
AND POSNR = SEL_DATA-REF_DOC_ITM
AND BUKRS = SEL_DATA-COMPANY_CODE
AND BDJPOPER = SEL_DATA-YEAR_PERIOD.
*move it_vbreve[] to exp_data[].
*move exp_data to e_t_data.
ENDif.
ENDFORM.
Any help please!
rdgs,
San!
Hi San,
I cannot see the source of your error, but I would suggest you look at the FM RSVD_BW_GET_DATA and merge your logic with the code from there.
I would then test this with RSA3 to make sure everything is working correctly. Cheers! Bill -
I_UPDMODE has no value in my Function Module when using Delta Extraction
Help me please.
My system is BW 3.52.
Please see the source code below and tell me why no value has been passed to I_UPDMODE. I have previously used "I_SOURCE", and that value is passed to I_DSOURCE. Can anyone tell me where the update mode is passed to?
FUNCTION ZBWFN_TEST_DELTA.
""Local Interface:
*" IMPORTING
*" VALUE(I_REQUNR) TYPE SBIWA_S_INTERFACE-REQUNR
*" VALUE(I_DSOURCE) TYPE SBIWA_S_INTERFACE-ISOURCE OPTIONAL
*" VALUE(I_MAXSIZE) TYPE SBIWA_S_INTERFACE-MAXSIZE OPTIONAL
*" VALUE(I_INITFLAG) TYPE SBIWA_S_INTERFACE-INITFLAG OPTIONAL
*" VALUE(I_UPDMODE) TYPE SBIWA_S_INTERFACE-UPDMODE OPTIONAL
*" VALUE(I_DATAPAKID) TYPE SBIWA_S_INTERFACE-DATAPAKID OPTIONAL
*" VALUE(I_RLOGSYS) TYPE SRSC_S_INTERFACE-RLOGSYS OPTIONAL
*" VALUE(I_READ_ONLY) TYPE SRSC_S_INTERFACE-READONLY OPTIONAL
*" TABLES
*" I_T_SELECT TYPE SBIWA_T_SELECT OPTIONAL
*" I_T_FIELDS TYPE SBIWA_T_FIELDS OPTIONAL
*" E_T_DATA STRUCTURE ZISU_ERCHC OPTIONAL
*" EXCEPTIONS
*" NO_MORE_DATA
*" ERROR_PASSED_TO_MESS_HANDLER
* This extractor is part of a delta scenario based on a timestamp
* included in the fields of table ROVERCUBE1. The interesting part
* takes place in form get_time_interval, where the date range is
* calculated specifically for the update mode.
* The pointer for the date up to which delta was extracted during
* the last delta update is held in table ROBWQTSTAT.
TABLES: ZISU_TP_ERCHC, ERCH, ERCHC.
* Auxiliary selection criteria structure
DATA: L_S_SELECT TYPE SBIWA_S_SELECT.
DATA: L_ERCHC LIKE ZISU_TP_ERCHC OCCURS 0 WITH HEADER LINE.
DATA: L_DATE LIKE SY-DATUM,
L_ACTUAL_DATE LIKE SY-DATUM,
L_LAST_DATE LIKE SY-DATUM.
* Maximum number of lines for DB table
STATICS: L_MAXSIZE TYPE SBIWA_S_INTERFACE-MAXSIZE,
BEGIN OF S_S_INTERFACE.
INCLUDE TYPE SBIWA_S_INTERFACE.
INCLUDE TYPE SRSC_S_INTERFACE.
STATICS: END OF S_S_INTERFACE.
STATICS: BEGIN OF S_R_TSTMP OCCURS 1,
SIGN(1),
OPTION(2),
LOW LIKE ROVERCUBE1-TSTMP,
HIGH LIKE ROVERCUBE1-TSTMP,
END OF S_R_TSTMP.
* Initialization mode (first call by SAPI) or data transfer mode
* (following calls) ?
IF I_INITFLAG = SBIWA_C_FLAG_ON.
* Invalid second initialization call -> error exit
IF NOT G_FLAG_INTERFACE_INITIALIZED IS INITIAL.
IF 1 = 2. MESSAGE E008(R3). ENDIF.
LOG_WRITE 'E' "message type
'R3' "message class
'008' "message number
' ' "message variable 1
' '. "message variable 2
RAISE ERROR_PASSED_TO_MESS_HANDLER.
ENDIF.
* Check DataSource validity
CASE I_DSOURCE.
WHEN 'ZOVER_TRANS'.
WHEN 'TEST_ROVERCUBE'.
WHEN 'DO_DATASOURCE'.
WHEN '0VER_DELTA_WITH_LONG_NAME'.
WHEN '0VER_CUBE_OLD_LIS'.
WHEN '0VER_TYPE_ATTR'.
WHEN OTHERS.
IF 1 = 2. MESSAGE E009(R3). ENDIF.
LOG_WRITE 'E' "message type
'R3' "message class
'009' "message number
I_DSOURCE "message variable 1
' '. "message variable 2
RAISE ERROR_PASSED_TO_MESS_HANDLER.
ENDCASE.
* Check for supported update mode
CASE I_UPDMODE.
WHEN 'F'.
WHEN 'D'.
WHEN 'C'.
WHEN 'R'.
WHEN 'S'.
WHEN OTHERS.
IF 1 = 2. MESSAGE E011(R3). ENDIF.
LOG_WRITE 'E' "message type
'R3' "message class
'011' "message number
I_UPDMODE "message variable 1
' '. "message variable 2
RAISE ERROR_PASSED_TO_MESS_HANDLER.
ENDCASE.
APPEND LINES OF I_T_SELECT TO G_T_SELECT.
* Fill parameter buffer for data extraction calls
S_S_INTERFACE-REQUNR = I_REQUNR.
S_S_INTERFACE-ISOURCE = I_DSOURCE.
S_S_INTERFACE-MAXSIZE = I_MAXSIZE.
S_S_INTERFACE-INITFLAG = I_INITFLAG.
S_S_INTERFACE-UPDMODE = I_UPDMODE.
S_S_INTERFACE-RLOGSYS = I_RLOGSYS.
S_S_INTERFACE-READONLY = I_READ_ONLY.
G_FLAG_INTERFACE_INITIALIZED = SBIWA_C_FLAG_ON.
APPEND LINES OF I_T_FIELDS TO G_T_FIELDS.
* here the time range for update modes concerning delta is calculated
* and the status table is updated
PERFORM GET_CAL_INTERVAL TABLES G_R_DELTA_DATE[]
USING S_S_INTERFACE-ISOURCE
S_S_INTERFACE-UPDMODE
S_S_INTERFACE-RLOGSYS.
ELSE. "Initialization mode or data extraction ?
* Data transfer: First call: calculate range tables for key fields,
* calculate date range due to update mode,
* OPEN CURSOR + FETCH.
* Following calls: FETCH only.
* First data package -> OPEN CURSOR
G_COUNTER_DATAPAKID = G_COUNTER_DATAPAKID + 1.
IF G_COUNTER_DATAPAKID = 1.
* Fill range tables.
LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'COUNTRY'.
MOVE-CORRESPONDING L_S_SELECT TO L_R_COUNTRY.
APPEND L_R_COUNTRY.
ENDLOOP.
LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'REGION'.
MOVE-CORRESPONDING L_S_SELECT TO L_R_REGION.
APPEND L_R_REGION.
ENDLOOP.
LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'KUNNR'.
MOVE-CORRESPONDING L_S_SELECT TO L_R_KUNNR.
APPEND L_R_KUNNR.
ENDLOOP.
LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'TYPE'.
MOVE-CORRESPONDING L_S_SELECT TO L_R_TYPE.
APPEND L_R_TYPE.
ENDLOOP.
LOOP AT G_T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'GJAHR'.
MOVE-CORRESPONDING L_S_SELECT TO L_R_GJAHR.
APPEND L_R_GJAHR.
ENDLOOP.
* no data must be selected in Init simulation mode
CHECK S_S_INTERFACE-UPDMODE NE SRSC_C_UPDMODE_INITSIMU.
* Determine number of database records to be read per FETCH statement
* from input parameter I_MAXSIZE.
L_MAXSIZE = G_S_INTERFACE-MAXSIZE.
REFRESH: L_ERCHC.
SELECT * FROM ERCH WHERE ERDAT IN G_R_DELTA_DATE
OR AEDAT IN G_R_DELTA_DATE.
SELECT SINGLE * FROM ERCHC WHERE BELNR = ERCH-BELNR.
IF SY-SUBRC = 0.
CLEAR: L_ERCHC.
L_ERCHC-BUKRS = ERCH-BUKRS.
L_ERCHC-ABRVORG = ERCH-ABRVORG.
L_ERCHC-PORTION = ERCH-PORTION.
L_ERCHC-GPARTNER = ERCH-GPARTNER.
IF ERCHC-CPUDT IN G_R_DELTA_DATE.
L_ERCHC-DELDT = ERCHC-CPUDT.
L_ERCHC-DOCDT = ERCHC-BUDAT.
L_ERCHC-RELNO = 1.
COLLECT L_ERCHC.
ENDIF.
IF ERCHC-INTCPUDT IN G_R_DELTA_DATE AND
ERCHC-INTCPUDT IS NOT INITIAL.
L_ERCHC-DELDT = ERCHC-INTCPUDT.
L_ERCHC-DOCDT = ERCHC-INTBUDAT.
L_ERCHC-REVNO = 1.
COLLECT L_ERCHC.
ENDIF.
ENDIF.
ENDSELECT.
DELETE FROM ZISU_TP_ERCHC.
LOOP AT L_ERCHC.
MOVE-CORRESPONDING L_ERCHC TO ZISU_TP_ERCHC.
INSERT ZISU_TP_ERCHC.
ENDLOOP.
OPEN CURSOR WITH HOLD G_CURSOR FOR
SELECT * FROM ZISU_TP_ERCHC.
ENDIF. "First data package ?
IF S_S_INTERFACE-UPDMODE = SRSC_C_UPDMODE_INITSIMU.
RAISE NO_MORE_DATA.
ENDIF.
* Fetch records into interface table.
FETCH NEXT CURSOR G_CURSOR
APPENDING CORRESPONDING FIELDS OF TABLE E_T_DATA
PACKAGE SIZE S_S_INTERFACE-MAXSIZE.
IF SY-SUBRC <> 0.
RAISE NO_MORE_DATA.
ENDIF.
ENDIF. "Initialization mode or data extraction ?
ENDFUNCTION.
Dave,
1. You can fire SELECTs in an RFC as well, but in your case the data exists in System A and the RFC is in System B, so you can't do that. You can only fire SELECTs on tables in the same system.
2. Quick example of nested loops over two tables - EKKO (header) and EKPO (item):
LOOP AT EKKO.
LOOP AT EKPO WHERE EBELN = EKKO-EBELN.
ENDLOOP.
ENDLOOP.
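A slightly fuller sketch of the same header/item pattern, written with explicit work areas instead of header lines (the declarations below are illustrative; in a real program the tables would be filled from the database first):

```abap
* Illustrative declarations - assume lt_ekko/lt_ekpo have been
* filled beforehand, e.g. via SELECT ... INTO TABLE.
DATA: lt_ekko TYPE TABLE OF ekko,
      lt_ekpo TYPE TABLE OF ekpo,
      ls_ekko TYPE ekko,
      ls_ekpo TYPE ekpo.

LOOP AT lt_ekko INTO ls_ekko.
  WRITE: / 'PO:', ls_ekko-ebeln.
* Inner loop restricted to the items of the current header
  LOOP AT lt_ekpo INTO ls_ekpo WHERE ebeln = ls_ekko-ebeln.
    WRITE: / '  Item:', ls_ekpo-ebelp, ls_ekpo-matnr.
  ENDLOOP.
ENDLOOP.
```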
I hope this is clear now.
Regards,
Ravi