Can I programmatically write data to the Data Source Filter in an external list view?
Hi, everybody. I need help with an external list. A view on an external list has a parameter named "Data Source Filter" that supplies filter criteria for the BCS query against the database. I want to write data to this parameter programmatically, but I have no idea which class, property, or method can do it. Any ideas?
Aleksandr Shramko
Hi Aleksandr Shramko,
To set the filter value for a standard SharePoint list, the SPView.SetViewXml method can be used. The following thread contains a sample that may help you get a better understanding of how to use it:
http://social.technet.microsoft.com/Forums/sharepoint/en-US/0ddca0f6-2b17-4f4a-9d66-f3d684ed4a11/how-to-set-orderedview-of-the-list-view-programmatically?forum=sharepointdevelopmentprevious
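For reference, the view XML passed to such a method wraps a CAML query; a minimal filter fragment looks like this (the field name `Title` and the value are placeholders for illustration, not from the original question):

```xml
<View>
  <Query>
    <Where>
      <!-- Keep only items whose Title equals "Contoso" -->
      <Eq>
        <FieldRef Name="Title" />
        <Value Type="Text">Contoso</Value>
      </Eq>
    </Where>
  </Query>
</View>
```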
However, I haven't been able to make it work for an external list. In this situation, I would suggest building a custom control to display the list items instead of customizing the existing list view filter. You could use a GridView or another table control to show the data.
A sample here:
http://stackoverflow.com/questions/7647941/retriving-splistitem-through-external-lists-whose-source-is-from-sql-server-08
Thanks,
Qiao Wei
TechNet Community Support
Similar Messages
-
How to Insert Data in a Database using BCS without an External List
Hi,
How can I insert, update, and delete data from the user interface using Business Connectivity Services without an external list? Please suggest a solution.
Regards,
khadar
Once you've configured the external content type, you can interact with it using the BDC object models available to you. You can use the server-side or client-side object model and interact with the database without an external list.
Check these links:
Server side:
http://msdn.microsoft.com/en-us/library/office/ff464357(v=office.14).aspx#sectionSection3
Client Side: http://msdn.microsoft.com/library/jj164116.aspx
Brandon Atkinson
Blog: http://sharepointbrandon.com -
Editing dates in Organizer - but how can I 'hard' write the new date to the actual files?
Hi, I have a number of files which were created by digitizing about 40 analog tapes. These tapes have been split into fairly large clips using the AV Cutty software (it worked very well), resulting in about 25 AVI files per tape.
I have now imported all of these (from folders) into Organizer to tag them, etc., before creating projects in Premiere.
However, while I can amend the dates in Organizer (some of the clips have a timestamp on screen), I want the changes to be hard-coded back to the original files. Is there any way to do this? Otherwise, will I have to edit the dates of the clips in Windows Explorer, using something like AttributeMagic?
Thanks in advance - and hope this helps others at some point too.
wj
PS: I fully understand that Organizer is purely referential, but is there no way to allow the data (e.g., tags/dates) to be written back to the source files?
I wonder if I should look at other video software tools?
PC drives are generally formatted MBR (master boot record) with NTFS partitions.
A Mac can read NTFS but cannot write to it unless you buy additional software.
Do you want to keep that drive's files for read access, or is your PC-world done? -
Can you use SQL as a data source for a project in the same way you can in Excel?
Excel allows you to create a data source that executes a SQL stored procedure, display that data as a table in a spreadsheet and have that data automatically refresh each time you open the spreadsheet. Is it possible to do the same thing in MS Project, displaying
the data from the stored procedure as a series of tasks?
Here's what I'm trying to do - I have a stored procedure that pulls task data meeting a specific criteria from all projects in Project Server. We're currently displaying this data as an Excel report. However, the data includes start dates and durations so
it would be nice to be able to display it as a Gantt chart. I've played around with creating a Gantt chart in Excel and have been able to do a very basic one, but it doesn't quite fit our needs.
No, you cannot use SQL as a data source for a project.
You have 3 options to achieve it:
1. You can create a SharePoint list with the desired columns, fill in the desired data, and then create an MS Project file from the SharePoint list.
2. You can create an SSRS report that displays a Gantt chart; Joe has given you that link.
3. You can write a macro in MPP that takes data from Excel, where Excel fetches the data from your stored procedure. Either write a schedule that runs every day to update your data, or create an Excel report that updates automatically and write a macro in MPP that fetches the data and then publishes it so that it is available to team members.
kirtesh -
Can we merge data from multiple sources in Hyperion Interactive Reporting?
Hi Experts,
Can we merge data from multiple sources in Hyperion Interactive Reporting? For example, can we have a report based on DB2, Oracle, Informix, and multidimensional databases like DB2, MSOLAP, and Essbase?
Thanks,
V
Yes. Each source has its own query, and they share some common dimensions so the Results sections can be joined together in a final query.
Look in the help for "Creating Local Joins" -
How can I use notifications to send data from different sources to the same chart?
Hi,
I am using the "Continuous Measurement and Logging" template project that ships with LV 2013.
It is extremely helpful for figuring out messaging between the acquire, graph, and log loops. (Thanks NI!)
I've run into a snag however.
I would like to modify it so that my graphing loop receives data notifications from two acquisition sources via notifiers.
I'm having trouble getting data from both sources to display on the same chart.
I've isolated the issue in the attached vi.
Here's what happens:
1. I create data from 2 parallel loops and send the data to a third parallel loop with notifiers.
2. The third loop receives data from only one of the loops because one of the receiving notifiers just times out instead of receiving data.
Can someone suggest how I can fix this?
Thanks.
- Matt
Solved!
Go to Solution.
Attachments:
test notification loop.vi 42 KB
Here is my modification of your VI. I put notes on the block diagram to explain the changes. It uses a queue for the data transfer to avoid loss of data. It uses a notifier to stop the loops. All local variables and Value property nodes have been eliminated.
The way the loops are stopped will likely leave some data in the queue. No more than one or two iterations of each of the data acquisition loops. If you need to guarantee that all data has been displayed (or saved in a real application), then you need to stop the acquisition loops first and read the queue until you know it is empty and both of the other loops have stopped. Then stop the display loop and release the queue and notifier.
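The same producer/consumer idea can be sketched outside LabVIEW: a shared queue buffers every sample from multiple sources, whereas a notifier only ever holds the latest value. A minimal Python analogy (the source names here are made up for illustration):

```python
import queue
import threading

def producer(q, source_id, values):
    # Each acquisition loop enqueues its readings; nothing is lost,
    # because a queue buffers every element (unlike a notifier,
    # which keeps only the newest value).
    for v in values:
        q.put((source_id, v))

def consume_all(q, expected_count):
    # The display loop dequeues from one queue shared by both
    # producers, so data from both sources reaches a single "chart".
    return [q.get() for _ in range(expected_count)]

data_q = queue.Queue()
t1 = threading.Thread(target=producer, args=(data_q, "DAQ-A", [1, 2, 3]))
t2 = threading.Thread(target=producer, args=(data_q, "DAQ-B", [10, 20, 30]))
t1.start(); t2.start()
t1.join(); t2.join()
samples = consume_all(data_q, 6)
```

Because both producers write into the same queue, the consumer sees all six samples regardless of arrival order.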
Lynn
Attachments:
test notification loop.ldj.vi 1036 KB -
Can we transport Data mart data sources?
Hello Experts,
I have question on Data Marts.
Can we transport data mart DataSources? If yes, how is it different from the BW transport process?
Thanks a lot
Padma
Hi,
You can't transport data mart DataSources. You have to regenerate them in every system.
Cheers,
Kedar -
Hi guys,
I'm new to Mac and have a MacBook Pro with Lion OS (10.6.8, I think!) and Parallels 7 (Windows 7) installed. Can someone please tell me the best software program I can use to read, write, and modify data/files on an external HD (NTFS format) from the Mac OS? I've heard of Paragon and Tuxera NTFS. Are they free? Are they good? Are there any other software programs out there? I've heard that some people have issues with Paragon.
Thanks.
Your best bet would be to take the drive to the oldest Windows PC that is compatible with it, grab the files off, right-click and format it exFAT (XP users can download exFAT support from Microsoft), and then put the files back on.
Macs can read and write all Windows file formats except writing to NTFS (and in some cases not even reading it), so if you can change the format of the drive to exFAT (all data has to be removed first), you will have a drive that doesn't require paid third-party NTFS software (a license fee goes to Microsoft) for updates.
Also it's one less hassle to deal with too.
Drives, partitions, formatting w/Macs + PCs -
How can I import data from multiple sources into a single RPD in OBIEE 11g
How can I import data from multiple sources into a single RPD in OBIEE 11g?
Hi,
To import from multiple data sources, first configure ODBC connections for the respective data sources; then you can import from each of them. When you import the data, a connection pool will be created automatically.
tnx -
FDMEE - Write back - No Data and No Errors
I am trying to write back from Planning 11.1.2.3.500 to EBS R12 using FDMEE 11.1.2.3.530. There is data in Planning, and when I execute an Import it processes successfully, but the Transform Data process step has a warning symbol and there is no data in the grid in the Write Back Workbench.
Here is the log:
2015-03-12 16:55:58,845 INFO [AIF]: FDMEE Process Start, Process ID: 265
2015-03-12 16:55:58,845 INFO [AIF]: FDMEE Logging Level: 5
2015-03-12 16:55:58,846 INFO [AIF]: FDMEE Log File: \\Vmhodrxeap13\fdmee\outbox\logs\RXFin_265.log
2015-03-12 16:55:58,846 INFO [AIF]: User:wilsonp
2015-03-12 16:55:58,846 INFO [AIF]: Location:RXFin_EBS_PL (Partitionkey:5)
2015-03-12 16:55:58,847 INFO [AIF]: Period Name:NA (Period Key:null)
2015-03-12 16:55:58,847 INFO [AIF]: Category Name:NA (Category key:null)
2015-03-12 16:55:58,847 INFO [AIF]: Rule Name:Test_1 (Rule ID:10)
2015-03-12 16:56:00,465 INFO [AIF]: FDM Version: 11.1.2.3.530
2015-03-12 16:56:00,465 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2015-03-12 16:56:00,466 INFO [AIF]: Java Platform: java1.6.0_37
2015-03-12 16:56:00,466 INFO [AIF]: Log File Encoding: UTF-8
2015-03-12 16:56:01,904 DEBUG [AIF]: CommWb.importData - START
2015-03-12 16:56:01,908 DEBUG [AIF]: CommWb.getRuleInfo - START
2015-03-12 16:56:01,911 DEBUG [AIF]:
SELECT wr.RULE_ID
,wr.RULE_NAME
,wr.PARTITIONKEY
,ss.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,imp.IMPSOURCECOAID SOURCE_COA_ID
,COALESCE(wr.SOURCE_LEDGER_ID,0) SOURCE_LEDGER_ID
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,wr.PLAN_TYPE
,CASE wr.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,wl.POST_BY_YEAR
,wl.LEDGER_GROUP
,wl.LEDGER
,wl.GL_BUDGET_SCENARIO
,wl.GL_BUDGET_ORG
,wl.GL_BUDGET_VERSION
,wl.JE_CATEGORY
,wl.JE_SOURCE
,wl.CREATE_JOURNAL_FLAG
,wl.EXECUTION_MODE
,wl.IMPORT_FROM_SOURCE_FLAG
,wl.IMPORT_FROM_SOURCE_FLAG RECALCULATE_FLAG
,wl.EXPORT_TO_TARGET_FLAG
,wl.AS_OF_DATE
,wr.DP_MEMBER_NAME
,wl.KK_TRAN_ID
,wl.KK_SOURCE_TRAN
,wl.KK_BUDG_TRANS_TYPE
,wl.KK_ACCOUNTING_DT
,wl.KK_GEN_PARENT
,wl.KK_DEFAULT_EE
,wl.KK_PARENT_ENT_TYPE
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,CASE
WHEN (ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' AND wl.KK_SOURCE_TRAN = 'HYP_CHECK') THEN 'Y'
ELSE 'N'
END KK_CHECK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
FROM AIF_PROCESSES p
INNER JOIN AIF_WRITEBACK_LOADS wl
ON wl.LOADID = p.PROCESS_ID
INNER JOIN AIF_WRITEBACK_RULES wr
ON wr.RULE_ID = wl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = wr.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = part.PARTSOURCESYSTEMID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = part.PARTTARGETAPPLICATIONID
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = part.PARTSOURCESYSTEMID
AND l.SOURCE_LEDGER_ID = wr.SOURCE_LEDGER_ID
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = wr.LEDGER_GROUP
WHERE p.PROCESS_ID = 265
2015-03-12 16:56:01,914 DEBUG [AIF]:
SELECT wld.DIMENSION_NAME
,wld.FILTER_CONDITION
,app.TARGET_APPLICATION_NAME
FROM AIF_WRITEBACK_LOAD_DTLS wld
INNER JOIN AIF_WRITEBACK_LOADS wl
ON wl.LOADID = wld.LOADID
INNER JOIN AIF_WRITEBACK_RULES wr
ON wr.RULE_ID = wl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = wr.PARTITIONKEY
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = part.PARTTARGETAPPLICATIONID
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = app.APPLICATION_ID
AND adim.TARGET_DIMENSION_NAME = wld.DIMENSION_NAME
AND adim.TARGET_DIMENSION_CLASS_NAME = 'Scenario'
WHERE wld.LOADID = 265
2015-03-12 16:56:01,916 DEBUG [AIF]:
SELECT COALESCE(pca.CATKEY, pc.CATKEY) CATKEY
FROM TPOVCATEGORY pc
LEFT OUTER JOIN TPOVCATEGORYADAPTOR pca
ON pca.INTSYSTEMKEY = 'RXFin'
AND pca.CATTARGET = NULL
WHERE pc.CATTARGET = NULL
2015-03-12 16:56:01,918 DEBUG [AIF]:
SELECT acks.source_segment_column_name DIMNAME, ss.source_system_type SOURCE_SYSTEM_TYPE
FROM AIF_PROCESSES p
INNER JOIN AIF_WRITEBACK_LOADS wl
ON wl.LOADID = p.PROCESS_ID
INNER JOIN AIF_WRITEBACK_RULES wr
ON wr.RULE_ID = wl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = wr.PARTITIONKEY
INNER JOIN AIF_CB_KEY_SEGMENTS acks
ON acks.control_budget_id = wr.SOURCE_LEDGER_ID
AND acks.source_system_id = part.PARTSOURCESYSTEMID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = part.PARTSOURCESYSTEMID
WHERE p.PROCESS_ID = 265
2015-03-12 16:56:01,919 DEBUG [AIF]:
SELECT lv.LOOKUP_DISPLAY_CODE DIMNAME
,wld.TEMP_COLUMN_NAME
,cs.COA_SEGMENT_NAME
,cs.VALUE_SET_ID
,cs.ACCOUNT_TYPE_FLAG
FROM AIF_COA_SEGMENTS cs
INNER JOIN TPOVPARTITION tpp
ON tpp.PARTITIONKEY = 5
INNER JOIN AIF_LOOKUP_TYPES lt
ON lt.SOURCE_SYSTEM_ID = 0
AND lt.LOOKUP_TYPE = 'AIF_SEGMENT_COLUMN_MAP'
INNER JOIN AIF_LOOKUP_VALUES lv
ON lv.LOOKUP_TYPE_ID = lt.LOOKUP_TYPE_ID
AND lv.LOOKUP_CODE = cs.COA_SEGMENT_NAME
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPMAPTYPE = 'EPM'
AND tiie.IMPSOURCECOALINEID1 = cs.COA_LINE_ID
LEFT OUTER JOIN AIF_WRITEBACK_LOAD_DTLS wld
ON wld.LOADID = 265
AND wld.DIMENSION_NAME = tiie.IMPDIMNAME
WHERE cs.SOURCE_SYSTEM_ID = 4
AND cs.SOURCE_COA_ID = 50348
ORDER BY lv.LOOKUP_DISPLAY_CODE
2015-03-12 16:56:01,922 DEBUG [AIF]: CommWb.getRuleInfo - END
2015-03-12 16:56:01,924 DEBUG [AIF]: AIFUtil.callOdiServlet - START
2015-03-12 16:56:01,948 DEBUG [AIF]: cloudMode: NONE
2015-03-12 16:56:01,949 DEBUG [AIF]: GlobalUserForAppAccess from Profile: null
2015-03-12 16:56:01,951 INFO [AIF]: Resolved user name for application access: wilsonp
2015-03-12 16:56:02,450 INFO [AIF]: [HPLService] Info: Cube Name: Finance
2015-03-12 16:56:02,451 INFO [AIF]: [HPLService] Info: Importing data from RXFin:Finance...
2015-03-12 16:57:31,732 INFO [AIF]: [HPLService] Info: Data import complete
2015-03-12 16:57:31,738 INFO [AIF]: [HPLService] Info: [importWritebackData:265] END (true)
2015-03-12 16:57:31,747 DEBUG [AIF]: AIFUtil.callOdiServlet - END
2015-03-12 16:57:31,747 DEBUG [AIF]:
SELECT STATUS
FROM AIF_PROCESS_DETAILS
WHERE PROCESS_ID = 265
AND ENTITY_TYPE = 'PROCESS_WB_IMP'
2015-03-12 16:57:31,752 DEBUG [AIF]: CommWb.insertPeriods - START
2015-03-12 16:57:31,755 DEBUG [AIF]:
SELECT DIMENSION_NAME
,FILTER_CONDITION
FROM AIF_WRITEBACK_LOAD_DTLS
WHERE LOADID = 265
AND COLUMN_TYPE = 'Year'
2015-03-12 16:57:31,758 DEBUG [AIF]: commAppPeriodMappingExists: N
2015-03-12 16:57:31,758 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
SELECT q.PROCESS_ID
,q.PERIODKEY
,NULL PERIOD_ID
,'N' ADJUSTMENT_PERIOD_FLAG
,0 GL_PERIOD_YEAR
,'0' GL_PERIOD_CODE
,'0' GL_PERIOD_NAME
,q.ENTITY_NAME_ORDER GL_PERIOD_NUM
,q.ENTITY_NAME_ORDER GL_EFFECTIVE_PERIOD_NUM
,q.YEARTARGET
,q.PERIODTARGET
,'PROCESS_WB_IMP' IMP_ENTITY_TYPE
,NULL IMP_ENTITY_ID
,p.PERIODDESC IMP_ENTITY_NAME
,'PROCESS_WB_TRANS' TRANS_ENTITY_TYPE
,NULL TRANS_ENTITY_ID
,p.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,NULL SOURCE_LEDGER_ID
FROM (
SELECT PROCESS_ID
,MIN(PERIODKEY) PERIODKEY
,PERIODTARGET
,YEARTARGET
,ENTITY_NAME_ORDER
FROM (
SELECT wld.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,pp.PERIODTARGET PERIODTARGET
,pp.YEARTARGET YEARTARGET
,CASE
WHEN (INSTR(UPPER(wld.TEMP_COLUMN_NAME),'AMOUNT',1) = 1) THEN
CAST(SUBSTR(wld.TEMP_COLUMN_NAME,7,LENGTH(wld.TEMP_COLUMN_NAME)) AS NUMERIC(15,0))
ELSE 0
END ENTITY_NAME_ORDER
FROM (
AIF_WRITEBACK_LOAD_DTLS wld
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODTARGET = wld.DIMENSION_NAME
AND pp.YEARTARGET = 'FY14')
WHERE wld.LOADID = 265
AND wld.COLUMN_TYPE = 'DATA'
) query
GROUP BY PROCESS_ID
,PERIODTARGET
,YEARTARGET
,ENTITY_NAME_ORDER
) q
,TPOVPERIOD p
WHERE p.PERIODKEY = q.PERIODKEY
ORDER BY p.PERIODKEY
2015-03-12 16:57:31,764 DEBUG [AIF]: CommWb.insertPeriods - END
2015-03-12 16:57:31,772 DEBUG [AIF]: COMM GL Writeback Load Data - Load TDATASEGW - START
2015-03-12 16:57:31,774 DEBUG [AIF]: CommWb.getLedgerListAndMap - START
2015-03-12 16:57:31,775 DEBUG [AIF]: CommWb.getLedgerSQL - START
2015-03-12 16:57:31,775 DEBUG [AIF]: CommWb.getLedgerSQL - END
2015-03-12 16:57:31,775 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.FUNCTIONAL_CURRENCY
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
FROM AIF_WRITEBACK_LOADS wl
,AIF_WRITEBACK_RULES wr
,TPOVPARTITION part
,AIF_COA_LEDGERS l
WHERE wl.LOADID = 265
AND wr.RULE_ID = wl.RULE_ID
AND part.PARTITIONKEY = wr.PARTITIONKEY
AND l.SOURCE_SYSTEM_ID = part.PARTSOURCESYSTEMID
AND l.SOURCE_LEDGER_ID = wr.SOURCE_LEDGER_ID
2015-03-12 16:57:31,777 DEBUG [AIF]: CommWb.getLedgerListAndMap - END
2015-03-12 16:57:31,778 DEBUG [AIF]:
SELECT acks.source_segment_column_name DIMNAME, ss.source_system_type SOURCE_SYSTEM_TYPE
FROM AIF_PROCESSES p
INNER JOIN AIF_WRITEBACK_LOADS wl
ON wl.LOADID = p.PROCESS_ID
INNER JOIN AIF_WRITEBACK_RULES wr
ON wr.RULE_ID = wl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = wr.PARTITIONKEY
INNER JOIN AIF_CB_KEY_SEGMENTS acks
ON acks.control_budget_id = wr.SOURCE_LEDGER_ID
AND acks.source_system_id = part.PARTSOURCESYSTEMID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = part.PARTSOURCESYSTEMID
WHERE p.PROCESS_ID = 265
2015-03-12 16:57:31,779 DEBUG [AIF]:
SELECT lv.LOOKUP_DISPLAY_CODE DIMNAME
,wld.TEMP_COLUMN_NAME
,cs.COA_SEGMENT_NAME
,cs.VALUE_SET_ID
,cs.ACCOUNT_TYPE_FLAG
FROM AIF_COA_SEGMENTS cs
INNER JOIN TPOVPARTITION tpp
ON tpp.PARTITIONKEY = 5
INNER JOIN AIF_LOOKUP_TYPES lt
ON lt.SOURCE_SYSTEM_ID = 0
AND lt.LOOKUP_TYPE = 'AIF_SEGMENT_COLUMN_MAP'
INNER JOIN AIF_LOOKUP_VALUES lv
ON lv.LOOKUP_TYPE_ID = lt.LOOKUP_TYPE_ID
AND lv.LOOKUP_CODE = cs.COA_SEGMENT_NAME
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPMAPTYPE = 'EPM'
AND tiie.IMPSOURCECOALINEID1 = cs.COA_LINE_ID
LEFT OUTER JOIN AIF_WRITEBACK_LOAD_DTLS wld
ON wld.LOADID = 265
AND wld.DIMENSION_NAME = tiie.IMPDIMNAME
WHERE cs.SOURCE_SYSTEM_ID = 4
AND cs.SOURCE_COA_ID = 50348
ORDER BY lv.LOOKUP_DISPLAY_CODE
2015-03-12 16:57:31,783 DEBUG [AIF]: CommWb.getPovList - START
2015-03-12 16:57:31,784 DEBUG [AIF]:
SELECT wld.DIMENSION_NAME
,wld.FILTER_CONDITION
,app.TARGET_APPLICATION_NAME
FROM AIF_WRITEBACK_LOAD_DTLS wld
INNER JOIN AIF_WRITEBACK_LOADS wl
ON wl.LOADID = wld.LOADID
INNER JOIN AIF_WRITEBACK_RULES wr
ON wr.RULE_ID = wl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = wr.PARTITIONKEY
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = part.PARTTARGETAPPLICATIONID
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = app.APPLICATION_ID
AND adim.TARGET_DIMENSION_NAME = wld.DIMENSION_NAME
AND adim.TARGET_DIMENSION_CLASS_NAME = 'Scenario'
WHERE wld.LOADID = 265
2015-03-12 16:57:31,785 DEBUG [AIF]:
SELECT COALESCE(pca.CATKEY, pc.CATKEY) CATKEY
FROM TPOVCATEGORY pc
LEFT OUTER JOIN TPOVCATEGORYADAPTOR pca
ON pca.INTSYSTEMKEY = 'RXFin'
AND pca.CATTARGET = NULL
WHERE pc.CATTARGET = NULL
2015-03-12 16:57:31,787 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
,YEARTARGET
FROM (
SELECT DISTINCT wr.PARTITIONKEY
,part.PARTNAME
,cat.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,wr.RULE_ID
,wr.RULE_NAME
,pprd.YEARTARGET
FROM AIF_WRITEBACK_LOADS wl
INNER JOIN AIF_WRITEBACK_RULES wr
ON wr.RULE_ID = wl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = wr.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = NULL
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = wl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE wl.LOADID = 265
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2015-03-12 16:57:31,788 DEBUG [AIF]: CommWb.getPovList - END
2015-03-12 16:57:31,789 DEBUG [AIF]: COMM GL Writeback Load Data - Load TDATASEGW - END
2015-03-12 16:57:31,789 DEBUG [AIF]: CommWb.importData - END
2015-03-12 16:57:31,879 DEBUG [AIF]: CommWb.insertTransProcessDetails - START
2015-03-12 16:57:31,880 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'TDATASEGW' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'wilsonp' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT PROCESS_ID
,TRANS_ENTITY_TYPE ENTITY_TYPE
,MIN(TRANS_ENTITY_ID) ENTITY_ID
,TRANS_ENTITY_NAME ENTITY_NAME
,MIN(GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 265
AND PRIOR_PERIOD_FLAG = 'N'
GROUP BY PROCESS_ID
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_NAME
) q
ORDER BY ENTITY_NAME_ORDER
2015-03-12 16:57:31,887 DEBUG [AIF]: CommWb.insertTransProcessDetails - END
2015-03-12 16:57:31,891 DEBUG [AIF]:
DELETE FROM TDATAMAP_T
WHERE LOADID < 265
AND EXISTS (
SELECT 1
FROM AIF_PROCESSES p
WHERE p.RULE_ID = 10
AND p.PROCESS_ID = TDATAMAP_T.LOADID
2015-03-12 16:57:31,901 DEBUG [AIF]:
DELETE FROM AIF_WRITEBACK_ESS_DATA_T
WHERE LOADID < 265
AND EXISTS (
SELECT 1
FROM AIF_PROCESSES p
WHERE p.RULE_ID = 10
AND p.PROCESS_ID = AIF_WRITEBACK_ESS_DATA_T.LOADID
2015-03-12 16:57:33,060 DEBUG [AIF]:
DELETE FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID < 265
AND EXISTS (
SELECT 1
FROM AIF_PROCESSES p
WHERE p.RULE_ID = 10
AND p.PROCESS_ID = AIF_PROCESS_PERIODS.PROCESS_ID
2015-03-12 16:57:33,066 DEBUG [AIF]:
DELETE FROM TDATASEGW
WHERE LOADID < 265
AND EXISTS (
SELECT 1
FROM AIF_PROCESSES p
WHERE p.RULE_ID = 10
AND p.PROCESS_ID = TDATASEGW.LOADID
2015-03-12 16:57:33,069 DEBUG [AIF]: CommMap.loadTDATAMAP_T - START
2015-03-12 16:57:33,071 DEBUG [AIF]: CommData.getMapPartitionKeyandName - START
2015-03-12 16:57:33,071 DEBUG [AIF]:
SELECT COALESCE(part_parent.PARTITIONKEY, part.PARTITIONKEY) PARTITIONKEY
,COALESCE(part_parent.PARTNAME, part.PARTNAME) PARTNAME
FROM TPOVPARTITION part
LEFT OUTER JOIN TPOVPARTITION part_parent
ON part_parent.PARTITIONKEY = part.PARTPARENTKEY
WHERE part.PARTITIONKEY = 5
2015-03-12 16:57:33,073 DEBUG [AIF]: CommData.getMapPartitionKeyandName - END
2015-03-12 16:57:33,073 DEBUG [AIF]:
INSERT INTO TDATAMAP_T (
LOADID
,DATAKEY
,PARTITIONKEY
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
,RULE_ID
SELECT 265
,DATAKEY
,5 PARTITIONKEY
,DIMNAME
,SRCKEY
,SRCDESC
,CASE WHEN (TDATAMAPTYPE = 'EPM' AND TARGKEY = '<BLANK>') THEN ' ' ELSE TARGKEY END
,WHERECLAUSETYPE
,CASE WHEN (TDATAMAPTYPE = 'EPM' AND WHERECLAUSEVALUE = '<BLANK>') THEN ' ' ELSE WHERECLAUSEVALUE END
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
,RULE_ID
FROM TDATAMAP tdm
WHERE PARTITIONKEY = 5
AND (RULE_ID IS NULL OR RULE_ID = 10)
AND (
TDATAMAPTYPE = 'EPM'
OR (
TDATAMAPTYPE = 'MULTIDIM'
AND EXISTS (
SELECT 1
FROM TDATAMAP multidim
WHERE multidim.PARTITIONKEY = tdm.PARTITIONKEY
AND multidim.TDATAMAPTYPE = 'EPM'
AND multidim.DATAKEY = tdm.TARGKEY
2015-03-12 16:57:33,077 DEBUG [AIF]: Number of Rows inserted into TDATAMAP_T: 14
2015-03-12 16:57:33,122 DEBUG [AIF]: CommMap.updateTDATASEG_T_TDATASEGW - START
2015-03-12 16:57:33,123 DEBUG [AIF]:
SELECT DIMNAME
,SRCKEY
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,DATAKEY
,MAPPING_TYPE
,CASE WHEN (RULE_ID IS NOT NULL) THEN 'Y' ELSE 'N' END IS_RULE_MAP
FROM (
SELECT DISTINCT tdm.DIMNAME
,tdm.RULE_ID
,NULL SRCKEY
,NULL TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,NULL CHANGESIGN
,1 SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,NULL DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
WHERE tdm.LOADID = 265
AND tdm.PARTITIONKEY = 5
AND tdm.TDATAMAPTYPE = 'EPM'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 10)
AND tdm.WHERECLAUSETYPE IS NULL
UNION ALL
SELECT tdm.DIMNAME
,tdm.RULE_ID
,tdm.SRCKEY
,tdm.TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,tdm.CHANGESIGN
,CASE tpp.PARTSEQMAP
WHEN 0 THEN CASE
WHEN (tdm.WHERECLAUSETYPE = 'BETWEEN') THEN 2
WHEN (tdm.WHERECLAUSETYPE = 'IN') THEN 3
WHEN (tdm.WHERECLAUSETYPE = 'MULTIDIM') THEN 4
WHEN (tdm.WHERECLAUSETYPE = 'LIKE') THEN 5
ELSE 0
END
ELSE tdm.SEQUENCE
END SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,tdm.DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
INNER JOIN TPOVPARTITION tpp
ON tpp.PARTITIONKEY = tdm.PARTITIONKEY
WHERE tdm.LOADID = 265
AND tdm.PARTITIONKEY = 5
AND tdm.TDATAMAPTYPE = 'EPM'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 10)
AND tdm.WHERECLAUSETYPE IN ('BETWEEN','IN','MULTIDIM','LIKE')
) q
ORDER BY DIMNAME
,SEQUENCE
,RULE_ID
,SYSTEM_GENERATED_FLAG
,SRCKEY
2015-03-12 16:57:33,129 DEBUG [AIF]: CommMap.updateTDATASEG_T_TDATASEGW - END
2015-03-12 16:57:33,138 DEBUG [AIF]: CommMap.loadTDATAMAPSEG_TDATASEG - START
2015-03-12 16:57:33,139 DEBUG [AIF]: CommMap.loadTDATAMAPSEG_TDATASEG - END
2015-03-12 16:57:33,217 DEBUG [AIF]: CommMap.validateData - START
2015-03-12 16:57:33,218 DEBUG [AIF]: CommMap.validateData - END
2015-03-12 16:57:33,311 DEBUG [AIF]: Comm.finalizeProcess - START
2015-03-12 16:57:33,312 DEBUG [AIF]: CommWb.updateRuleStatus - START
2015-03-12 16:57:33,313 DEBUG [AIF]:
UPDATE AIF_WRITEBACK_RULES
SET STATUS = CASE 'SUCCESS'
WHEN 'SUCCESS' THEN
CASE (
SELECT COUNT(*)
FROM AIF_PROCESS_DETAILS pd
WHERE pd.PROCESS_ID = 265
AND pd.STATUS IN ('FAILED','WARNING')
WHEN 0 THEN 'SUCCESS'
ELSE (
SELECT MIN(pd.STATUS)
FROM AIF_PROCESS_DETAILS pd
WHERE pd.PROCESS_ID = 265
AND pd.STATUS IN ('FAILED','WARNING')
END
ELSE 'SUCCESS'
END
WHERE RULE_ID = 10
2015-03-12 16:57:33,317 DEBUG [AIF]: CommWb.updateRuleStatus - END
2015-03-12 16:57:33,318 DEBUG [AIF]: Comm.updateProcess - START
2015-03-12 16:57:33,322 DEBUG [AIF]: Comm.updateProcess - END
2015-03-12 16:57:33,323 INFO [AIF]: FDMEE Process End, Process ID: 265
I have created the following write-back mappings:
UD1,*,*,Z_Catch All Entity
UD1,CO_*,*,"Remove prefix ""Co_"""
UD2,*,*,Z_Catch All Account
UD2,ACC_*,*,Remove ACC_
UD3,*,*,Local Account Pass Through
UD4,*,*,Z_Catch All MC
UD4,CC_*,*,"Remove prefix ""CC_"""
UD5,*,*,Z_Catch All Event Edition
UD5,SY*,*,"Remove prefix ""SY"""
UD6,*,*,Activity pass through
UD7,*,*,Z_Catch All Product
UD7,PRD_*_I,*,Remove prefix PRD
UD8,*,*,ICP pass through
UD9,*,*,Spare pass through
Which is basically to strip off some prefixes, or just pass through the values.
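Conceptually, each mapping pair above is "strip a prefix if the source member carries it, otherwise fall through to the catch-all member". A rough Python sketch of that intent (this is an illustration only, not FDMEE's actual mapping engine):

```python
def map_member(member, prefix, catch_all):
    # Rule like ("CO_*", 'Remove prefix "CO_"'): strip the prefix when
    # it matches; otherwise the wildcard catch-all rule applies.
    if member.startswith(prefix):
        return member[len(prefix):]
    return catch_all

# e.g. the UD1 mappings from the post
first = map_member("CO_0052", "CO_", "Z_Catch All Entity")
second = map_member("1234", "CO_", "Z_Catch All Entity")
```

Here `first` becomes `"0052"` and `second` falls back to the catch-all member.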
In the Import Format I have the following Write Back Mapping:
Source Dimension Source Segment
Account Account
Activity Activity
Entity Entity
SessionYear Event Edition
Intercompany
Local Account
ManagementCentre Management Centre
Product Product
Spare
For my Write Back Rule I select the Location: RXFin_EBS_PL, the Primary Ledger in EBS
Period: Dec-14
Category: Budget - The name of the scenario in Planning.
Source: RX_EBS
Target: RXFIN
Plan Type: Finance
I have Source Filters to limit the amount of data it tries to pull in.
Account: @Relative("Profit & Loss",0)
Entity: "CO_0052"
Activity: @Relative("AllActivities",0)
Period: @Relative("YearTotalPlan",0)
Version: "FinalVersion"
Year: "FY14"
And for Budget: RX RF1 2014
Budget Organization: RX Budget Org - Though I am not sure what this does.
I execute Import from Source, and it processes for several minutes, and then nothing is in the Workbench.
There is data in Planning as far as I can see, and with no errors I am at a loss on what to investigate.
Can anyone assist?
Thanks,
Do you have any luck if you simplify the source filters to be as specific as possible and only bring in as few rows as possible from a known good intersection, as a sanity check?
Regards
Craig -
Reading the data from XML Source
Hi
I want to read data from an XML source file and write the updated transaction information to another XML file. How can we do this in ADF? Please help.
Thanks
nidhi
You may use the normal Java API to do that:
http://www.javablogging.com/read-and-write-xml/ -
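The linked article covers the Java APIs; the same read-update-write pattern using Python's standard library looks like this (the element and attribute names are invented for illustration):

```python
import xml.etree.ElementTree as ET

def update_transactions(source_xml):
    # Parse the source document, mark each transaction as processed,
    # and serialize the result so it can be written to another file.
    root = ET.fromstring(source_xml)
    for txn in root.findall("transaction"):
        txn.set("status", "processed")
    return ET.tostring(root, encoding="unicode")

updated = update_transactions(
    '<transactions><transaction id="1"/><transaction id="2"/></transactions>'
)
```

In a real application you would read the source string from one file and write `updated` to another; the transform in the middle is the same.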
How to design an extractor that has only current fiscal year data
Dear Experts,
I require a full-upload DSO which should have only current fiscal year data. I have created a view with certain conditions to minimize the data size. How can I restrict it to have only current fiscal year data for YTD reports? It would be uploaded daily.
Should I create a DataSource based on a function module?
Regards,
Anand Mehrotra.
While extracting the data from the source, you can set filters at the InfoPackage level so that it will not extract the previous years' data.
In the Data Selection tab of the InfoPackage, enter the current fiscal year and schedule the InfoPackage; it will extract only the current year's data.
Next, execute the DTP to load the data into the DSO.
The data is restricted at the InfoPackage level before it enters the PSA, so there is no need to write any routines at the DSO level.
Edited by: prashanthk on Feb 1, 2011 12:45 PM -
Validate data in R/3 while uploading from an external source to BW
Hi
My requirement is that while uploading data (200,000 to 500,000 records) from an external source file (XLS or Notepad) to BW, I have to check whether that data is present in the R/3 system or not.
Can anyone suggest the best way to do this validation (do any function modules exist?) with performance in mind, and also provide the FMs which give the best performance?
You need to explain it in more detail:
process
Document number
matching keys with SAP for validating records
etc
etc
regards
madan -
FCP error message: Unable to read the data on your source tape!
Hey,
So I have been editing on the MacPro now for over a year and I have never seen this message before...the message reads:
_*"Capture encountered a problem reading the data on your source tape. This could be due to a problem with the tape."*_
Has anyone got this message and could help me out please? I need to edit these tapes A.S.A.P.
Thank You,
Erick
Why would it start giving me problems a month later instead of when it was updated?
That is a good point. Has anything changed between the time it worked and when it didn't? How about trying another FireWire cable? Those can go south without notice and might cause this.
Shane -
Data Reconciliation Data Sources in Business Content
Can you tell me where we can find these DataSources and explain how to use them? Do we need to define an InfoCube/ODS or anything like that to load the data and use a report to see the results?
Please explain me with one complete scenario.
Thanks.
Data reconciliation for DataSources allows you to ensure that the data loaded into BI is consistent, available, and used productively there.
It is based on a comparison of the data loaded into BI with the application data in the source system. You can access the data in the source system directly to perform this comparison.
The term "data reconciliation DataSource" is used for DataSources that serve as a reference for accessing the application data in the source directly, and therefore allow you to draw a comparison with the source data.
It allows you to check the integrity of the loaded data by, for example, comparing the total of a key figure in the DataStore object with the corresponding totals that the VirtualProviders access directly in the source system.
Hope this helps.