ABAP Dataflow or Regular Dataflow?
We can extract data from an ECC system using BODS with either a regular dataflow or an ABAP dataflow.
Are there any guidelines from SAP saying we should use ABAP dataflows only?
What is the advantage of an ABAP dataflow over a regular dataflow? Is there any performance improvement?
Hi Sagar,
Regular Dataflow:
Reads data from SAP applications and supports tables (for small data sets only) and extractors.
ABAP Dataflow:
Reads data from SAP applications via generated ABAP programs and supports tables, hierarchies, extractors, and functions with scalar arguments; well suited to large data sets.
Based on the number of records you mentioned, I would say ABAP is your better option, because for large amounts of data performance is generally better when you extract with an ABAP dataflow.
(Note: for the limitations/disadvantages of ABAP dataflows, please check this link: http://www.forumtopics.com/busobj/viewtopic.php?t=210395&view=next&sid=443a727d4201219c16aba7f5e786c231)
Hope this helps.
Regards,
Sandeep
Similar Messages
-
Creating View for a table with parent child relation in table
I need help creating a view. It is on a base table, which is a metadata table using a parent-child relationship. There are four types of objects: Job, Workflow, Dataflow, and ABAP dataflow. Job is always the root parent. I have saved all the jobs of the project in another table, TABLE_JOB, with column name JOB_NAME. The query should iteratively start from the job, search all the child nodes, and then display each child with its job name. Attached are images of the base-table data and the expected view data, and also an Excel sheet with the data. Picture 1 is the sample data in the base table; Picture 2 is the data in the view.
Base Table
PARENT_OBJ  PAREBT_OBJ_TYPE  DESCEN_OBJ  DESCEN_OBJ_TYPE
JOB_A       JOB              WF_1        WORKFLOW
JOB_A       JOB              DF_1        DATAFLOW
WF_1        WORKFLOW         DF_2        DATAFLOW
DF_1        DATAFLOW         ADF_1       ADF
JOB_B       JOB              WF_2        WORKFLOW
JOB_B       JOB              WF_3        WORKFLOW
WF_2        WORKFLOW         DF_3        DATAFLOW
WF_3        WORKFLOW         DF_4        DATAFLOW
DF_4        DATAFLOW         ADF_2       ADF
View
Job_Name  Flow_Name  Flow_Type
Job_A     WF_1       WORKFLOW
Job_A     DF_1       DATAFLOW
Job_A     DF_2       DATAFLOW
Job_A     ADF_1      ADF
Job_B     WF_2       WORKFLOW
Job_B     WF_3       WORKFLOW
Job_B     DF_3       DATAFLOW
Job_B     DF_4       DATAFLOW
Job_B     ADF_2      ADF
I implemented the same in Oracle using CONNECT_BY_ROOT and START WITH.
Regards,
Megha
I think what you need is a recursive CTE.
Consider your table below
create table basetable
(PARENT_OBJ varchar(10),
PAREBT_OBJ_TYPE varchar(10),
DESCEN_OBJ varchar(10),
DESCEN_OBJ_TYPE varchar(10))
INSERT basetable(PARENT_OBJ,PAREBT_OBJ_TYPE,DESCEN_OBJ,DESCEN_OBJ_TYPE)
VALUES('JOB_A','JOB','WF_1','WORKFLOW'),
('JOB_A','JOB','DF_1','DATAFLOW'),
('WF_1','WORKFLOW','DF_2','DATAFLOW'),
('DF_1','DATAFLOW','ADF_1','ADF'),
('JOB_B','JOB','WF_2','WORKFLOW'),
('JOB_B','JOB','WF_3','WORKFLOW'),
('WF_2','WORKFLOW','DF_3','DATAFLOW'),
('WF_3','WORKFLOW','DF_4','DATAFLOW'),
('DF_4','DATAFLOW','ADF_2','ADF')
First, create a UDF like the one below to walk the hierarchy recursively:
CREATE FUNCTION GetHierarchy
(
@Object varchar(10)
)
RETURNS @RESULTS table
(
PARENT_OBJ varchar(10),
DESCEN_OBJ varchar(10),
DESCEN_OBJ_TYPE varchar(10)
)
AS
BEGIN
;WITH CTE
AS
(
SELECT PARENT_OBJ,DESCEN_OBJ,DESCEN_OBJ_TYPE
FROM basetable
WHERE PARENT_OBJ = @Object
UNION ALL
SELECT b.PARENT_OBJ,b.DESCEN_OBJ,b.DESCEN_OBJ_TYPE
FROM CTE c
JOIN basetable b
ON b.PARENT_OBJ = c.DESCEN_OBJ
)
INSERT @RESULTS
SELECT @Object,DESCEN_OBJ,DESCEN_OBJ_TYPE
FROM CTE
-- Note: OPTION (MAXRECURSION 0) cannot be used inside a function body,
-- so the default recursion limit of 100 levels applies here.
RETURN
END
Then you can invoke it as below
SELECT * FROM dbo.GetHierarchy('JOB_A')
Now you need to apply this for every root parent object (start object) in the view. For that, create the view as below:
CREATE VIEW vw_Table
AS
SELECT f.*
FROM (SELECT DISTINCT PARENT_OBJ FROM basetable r
WHERE NOT EXISTS (SELECT 1
FROM basetable WHERE DESCEN_OBJ = r.PARENT_OBJ)
)b
CROSS APPLY dbo.GetHierarchy(b.PARENT_OBJ) f
GO
This makes sure it returns the full hierarchy for each start object.
Now just query the view and see the output:
SELECT * FROM vw_table
Output
PARENT_OBJ DESCEN_OBJ DESCEN_OBJ_TYPE
JOB_A WF_1 WORKFLOW
JOB_A DF_1 DATAFLOW
JOB_A ADF_1 ADF
JOB_A DF_2 DATAFLOW
JOB_B WF_2 WORKFLOW
JOB_B WF_3 WORKFLOW
JOB_B DF_4 DATAFLOW
JOB_B ADF_2 ADF
JOB_B DF_3 DATAFLOW
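For readers who want to sanity-check the logic outside SQL Server, here is a small Python sketch (an illustration only, not BODS or T-SQL code) that walks the same parent-child edges: it finds the root objects (those that never appear as a descendant) and collects every descendant reachable from each root, which is what the recursive UDF plus the NOT EXISTS filter in the view compute together.

```python
# Parent-child edges copied from the base table:
# (parent, descendant, descendant_type)
edges = [
    ("JOB_A", "WF_1", "WORKFLOW"),
    ("JOB_A", "DF_1", "DATAFLOW"),
    ("WF_1", "DF_2", "DATAFLOW"),
    ("DF_1", "ADF_1", "ADF"),
    ("JOB_B", "WF_2", "WORKFLOW"),
    ("JOB_B", "WF_3", "WORKFLOW"),
    ("WF_2", "DF_3", "DATAFLOW"),
    ("WF_3", "DF_4", "DATAFLOW"),
    ("DF_4", "ADF_2", "ADF"),
]

def roots(edges):
    """Objects that never appear as a descendant (the jobs); mirrors the NOT EXISTS subquery."""
    parents = {p for p, _, _ in edges}
    children = {c for _, c, _ in edges}
    return sorted(parents - children)

def descendants(edges, start):
    """Level-by-level walk, mirroring the recursive CTE in GetHierarchy."""
    result, frontier = [], [start]
    while frontier:
        nxt = []
        for parent, child, ctype in edges:
            if parent in frontier:
                result.append((start, child, ctype))
                nxt.append(child)
        frontier = nxt
    return result

# Equivalent of SELECT * FROM vw_Table
view_rows = [row for r in roots(edges) for row in descendants(edges, r)]
for row in view_rows:
    print(row)
```

The row order differs from the SQL output (the CTE emits rows in its own recursion order), but the same nine (job, flow, type) tuples come back.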
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
How can I pass a value field to a global variable?
I have a doubt:
I want to know if it is possible to pass a field value to a global variable inside a dataflow.
What I am trying to do is: when a job executes online (real time), depending on the field value I receive, only certain dataflows should be executed.
I hope you can help me.
Kind regards!!!
It can be done in various ways; your approach is kind of odd, though.
You are saying you have one kind of input message, and depending on a flag you want to do different transformations. The way you envision that is by loading the flag into a global variable and then using different dataflows.
Can you use different dataflows within one Realtime Job? Yes, usually the first loads an in-memory datastore.
Can you write into a global variable? Yes and No. Not directly, but you can write the value into e.g. a database and using the sql() function read it from there. But why would you. You could write it into the in-memory datastore. And then you run your n dataflows, each has a filter. So only one of these dataflows will actually process the data.
Are there other approaches? I would have a Case transform at the beginning and route the data into different queries depending on the data. So all in one dataflow. Your approach is fine for a batch dataflow where a dataflow is started, does something and then ends. But that's not how it works in Realtime! -
Selection screen dialog programming
I have used a selection screen in dialog programming.
I have written code in AT SELECTION-SCREEN to display the output in a table control on another screen, 1001.
My problem is that when I try to enter multiple values for material,
it goes back to AT SELECTION-SCREEN and displays the output again.
If I write the code in START-OF-SELECTION instead of AT SELECTION-SCREEN,
it doesn't give any output at all.
How do I correct this error?
SELECTION-SCREEN BEGIN OF SCREEN 2000 .
SELECT-OPTIONS : s_ersda FOR MCHB-ERSDA OBLIGATORY.
SELECT-OPTIONS : s_matnr FOR MCHB-MATNR.
SELECT-OPTIONS : s_matkl FOR MARA-MATKL.
SELECT-OPTIONS : s_ferth FOR MARA-FERTH.
SELECTION-SCREEN END OF SCREEN 2000.
AT SELECTION-SCREEN.
* statements to fill the output table
* screen 1001 contains the table control
CALL SCREEN 1001.
Friends,
Req for ABAP HR
ROLE: SAP HR ABAP DEVELOPER
Location: TULSA, OK
Duration: 3 + MONTHS
REQUIRED SKILLS: SAP ABAP / Dialog and regular / R/3 and HR / Workflow / smartforms
/ ALV / LSMW for HR and R/3 / Modules - MM , SD , HR, PD , PS
Start Date: 11-12-07
If interested, please send CVs to [email protected] -
Hi All,
Please help on the SSIS issue.
I have a package that contains a Foreach Loop container. Inside it there is one Data Flow task (with an OLE DB source table populating data into a flat file) with the flat file name YYYYMMDD_HH24MISS.txt. After that I want to zip the file under the same name,
YYYYMMDD_HH24MISS.txt.zip, and then copy the zip file to another location.
Thanks in advance...
Hi BADDULAS,
According to the screenshot in your initial post, you are using an Execute Process Task to compress the .txt file generated by the Data Flow Task. Let’s assume that you are using the common free software 7-Zip as the executable of the Execute Process Task,
then you can refer to the following blog:
http://sqlage.blogspot.com/2013/12/ssis-how-to-compress-and-archive-file.html
To get the expected name for the txt file, you can create another String type SSIS variable @[User::FileName] using the following expression (set EvaluateAsExpression property of the variable to True):
REPLACE(SUBSTRING((DT_STR,30,1252)GETDATE(),1,10),"-","")+"_"+ SUBSTRING((DT_STR,30,1252)GETDATE(),12,2) + "24MISS.txt"
The FileName variable should be used by the ConnectionString property of the Flat File Connection Manager that is used by the Flat File Destination in the Data Flow Task.
Besides, the variable FileName should also be used in the expression of the VarSourcePath variable and the VarArchivePath variable as follows:
VarSourcePath: "C:\\Temp\\Source\\" + @[User::FileName]
VarArchivePath: "C:\\Temp\\Destination\\" + REPLACE(@[User::FileName], ".txt", "")
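Outside of SSIS, the same build-name / zip / copy sequence is easy to prototype. Here is a hedged Python sketch (the folders are temporary stand-ins for the SSIS source and archive paths, not real configuration) that builds a YYYYMMDD_HHMMSS name, zips the flat file under the same name, and copies the archive to a second location:

```python
import os
import shutil
import zipfile
from datetime import datetime
from tempfile import mkdtemp

# Hypothetical folders standing in for VarSourcePath / VarArchivePath
source_dir = mkdtemp()
archive_dir = mkdtemp()

# Build the YYYYMMDD_HHMMSS file name (HH24MISS in Oracle date-format terms)
stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
txt_name = stamp + ".txt"
txt_path = os.path.join(source_dir, txt_name)

# Stand-in for the flat file produced by the Data Flow Task
with open(txt_path, "w") as f:
    f.write("col1,col2\n1,2\n")

# Zip it under the same name, then copy the archive to the second location
zip_path = txt_path + ".zip"
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write(txt_path, arcname=txt_name)
copied = shutil.copy(zip_path, archive_dir)
```

In the SSIS package itself the zipping step would still be an Execute Process Task calling 7-Zip (or similar), as described above; the sketch only shows the naming and file-handling logic.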
Regards,
Mike Yin
TechNet Community Support -
Sales Order/Item Population required in CIC
Hi,
Using T.Code QM01, we get into a screen with Sales Order, Item, Delivery, Item. Once we give the SO number or any of the other fields, it leads us to the Customer Feedback Notification screen with customer details.
The same kind of popup is expected in the Customer Interaction Center, i.e. when we run T.Code CIC0 it takes you through the screens directly to Customer Feedback Notification without the Sales Order/Item popup seen in QM01.
The question is: we get the Sales Order/Item popup in QM01, but not in CIC0. Can we set this popup (sales order, sales item screen) for CIC0 as well?
I tried to set this in the CIC Dataflows but could not get it to work. Please let me know how to make this happen.
Thank you
Ravi -
We have a situation where we have a job with multiple workflows, each with multiple dataflows. When the job ran we received an error on one of the dataflows.
However the job itself only shows a warning. The HDBODBC is a known warning that we ignore so our Ops team did not pick up that this job had failed.
Subsequently the steps after the failed step did not run and the monitor log shows it died in the middle of the failed step.
This is how the workflow is set up and it was the first dataflow that failed.
Long story short is this a known bug that is fixed with subsequent versions (we are on 4.0). Is there a way to work around this so that the rest of the job can continue?
Thanks,
Ken
See my comments...
1 - Can you please check that your DSN created for the HANA database is working fine, as it says HDBODBC is unknown. (KEG - My understanding is this is a known bug with 4.0 and HANA that will be fixed in subsequent versions. We plan to upgrade in the summer.)
2 - Please increase the field length for WBS_TEXT. (KEG - I believe this is also a known issue: SAP ERP and HANA have NVARCHAR datatypes but Data Services does not. Data Services treats the nvarchar field as varchar (because with Unicode one character can occupy two bytes) and that causes the error. This again will be fixed with an upgrade.)
3 - For the first run don't create any sub-dataflow. (KEG - I will look into this, but I have another case with a job containing a workflow and three dataflows: the first dataflow is an RFC call to an SAP function, the second processes the data, the third is another RFC call. No sub-dataflows. In this case we get an error in the second dataflow, the log shows the step marked "proceed", but the job itself is marked green as if it completed successfully. The first and third dataflows show "stop" in the log.)
The known issues in 1 and 2 aside, my concern is more about how Data Services behaves when there are errors: sometimes running the remaining dataflows, sometimes not, and marking the job with a warning or as successful even when there are errors. I am trying to understand whether that is a setup issue with how the jobs are constructed or a known bug in 4.0.
Hi guys,
On a 4.6C or higher system, can we use SLIN or the FM 'Extended program check' instead of SCI?
Are both outputs the same with respect to listing syntax errors/warnings/messages? Please confirm.
Regards
Ambichan
No, the two outputs are not the same.
SLIN is the extended syntax check.
SCI is the Code Inspector, a tool for checking Repository objects regarding performance, security, syntax, and adherence to naming conventions and other formats that you define. You can also gather statistical information or search for certain ABAP words (tokens). In the Code Inspector, you define inspections that, with the help of check variants, examine certain sets of objects. As the result of an inspection, you receive information, warning, or error messages on different properties of the examined objects.
SLIN
Many checks are excluded from the standard syntax check for performance reasons. The extended program check performs a complete check that includes the interfaces of external procedures called from your program.
Errors in the extended program check cause exceptions, which in turn cause runtime errors when you run the program. You must correct them. The exception to this is coding that cannot be reached. However, you should delete this to minimize the size of your program and make the source code easier to understand.
Warnings in the extended program check should also be corrected. If your program contains statements that are definitely correct but still produce warnings in the extended program check, you can exclude them from the check using pseudocomments ( "#EC ).
You should always run the extended program check on a new program. You have not finished developing a new program until you can run the extended program check on it without any errors or warnings. Messages are permissible, since they are generally not critical.
The extended program check is also only a static check. It cannot eliminate all of the circumstances that could lead to exception situations or runtime errors. For example, any statements in which you specify arguments dynamically as the contents of fields, or in which you call procedures dynamically, cannot be checked statically.
SCI
It helps developers to adhere to programming standards and guidelines by creating messages on less-than-optimal coding. The Code Inspector offers various possibilities to define object sets and to combine multiple single checks in so-called "check variants". These functions, and the tool's parallel processing framework, make the Code Inspector a flexible and effective development assistant.
The Code Inspector can be used in various scenarios with different types of checks, thus providing insights into the code quality from various angles.
Usage scenarios
1. Single object checks from the Development Workbench
You can check a single object with the Code Inspector from the ABAP Editor (transaction SE38), the Function Builder (transaction SE37), the Class Builder (transaction SE24), or the ABAP Dictionary (transaction SE11). To do this, choose <object> -> Check -> Code Inspector from the menu, where <object> can be a program, function module, class, or table. The respective single object is then checked with a default check variant.
2. Checks on transport objects from the Transport Organizer
You can invoke the Code Inspector from within the Transport Organizer (transaction SE09) to check objects in a transport request. To do this, choose Request/Task > Complete Check > Objects (Syntax Check).
3. Checks on sets of objects from transaction SCI
The Code Inspector (transaction SCI) itself enables you to create a wide range of object sets using standard selections via package, software and application component, source system, transport layer, responsible, object type, object name and so on.
In addition, special object collectors are available that allow you to read objects from a file, for example.
An object set can be combined with a check variant to a so-called "inspection" that can be executed in a single process or in parallel.
Types of checks and check variants
Below is a short extract of the types of checks and functions that are offered by Code Inspector. New checks can be implemented if required, see for example Code Inspector - How to create a new check .
Syntax
Syntax check; extended program check
Performance
Analysis of WHERE clauses for SELECT, UPDATE and DELETE; SELECT statements that bypass the table buffer; low-performance operations on internal tables; table attributes check
Security
Usage of critical statements; dynamic and cross-client database accesses; use of ADBC-interface
Robustness
Check of SY-SUBRC handling; suspect conversions; activation check for DDIC objects
Programming Conventions
Naming conventions
Search Functions
Search of ABAP tokens; search ABAP statement patterns; search for ABAP statements with regular expressions
Metrics and Statistics
Program complexity test; statement statistics
You can combine any of these single checks into so-called "check variants", for example to check for the adherence to given programming guidelines.
Best Practices
Developers can use the Code Inspector to support their everyday work. For example, the search functions or metric checks of the tool can be a great help when restructuring the code.
The Code Inspector allows developers to define which objects are to be checked and which quality aspect of the code is to be inspected (e.g. performance, security).
It is also possible to define global check variants as general programming guidelines, to ensure standardized programming within a development community.
Check variants can prescribe for example naming conventions or other rules. The global check variants 'DEFAULT' and 'TRANSPORT' inspect objects in the development workbench and in transport requests, respectively. These check variants contain SAP-defined settings, but can be modified as needed.
Another global check variant delivered with every SAP system is 'PERFORMANCE_CHECKLIST' which helps to detect less-than-optimal coding with regard to application performance.
Hope this Helps.
Vinodh Balakrishnan -
Hi
I have a package , with 2 dataflow task,
The first Data Flow task gets the data from the source table and loads it into a temp table, and I have an Execute SQL task that updates some columns with some logic.
The second Data Flow task gets the delta data from the temp table and loads it into the main table.
The package always hangs at the same position; it just stays yellow with no error message. When I try to load the data manually, by copying the same query and executing an INSERT ... SELECT statement, the data loads perfectly, but via SSIS it always hangs.
Are there any changes I need to make in the properties?
Please help.
I checked it; the status is sleeping, and I get the event type as a language event
and event info is "(@_msparam_0 nvarchar(4000),@_msparam_1 nvarchar(4000),@_msparam_2 nvarchar(4000),@_msparam_3 nvarchar(4000),@_msparam_4 nvarchar(4000))SELECT param.parameter_id AS [ID], param.name AS [Name] FROM sys.all_objects AS sp INNER JOIN sys.all_parameters
AS param ON param.object_id=sp.object_id WHERE (sp.type = @_msparam_0 OR sp.type = @_msparam_1 OR
sp.type=@_msparam_2)and(sp.name=@_msparam_3 and SCHEMA_NAME(sp.schema_id)=@_msparam_4) ORDER BY [ID] ASC"
what does this mean ?
I know the query on which the DFT hangs. It is a stored procedure; here is its syntax:
SELECT *
FROM DW_T_ASW_HOST_JOBDATA_TEMP t
LEFT JOIN DW_T_ASW_HOST_JOBDATA j
ON j.JOB_STEP_ID = t.JOB_STEP_ID
WHERE j.JOB_STEP_ID IS NULL
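That query is a standard left anti-join: it keeps the temp-table rows that have no matching JOB_STEP_ID in the main table. A quick Python sketch of the same set logic (the table contents here are invented purely for illustration):

```python
# Invented stand-ins for the temp and main tables, keyed by JOB_STEP_ID
temp_rows = [{"JOB_STEP_ID": 1}, {"JOB_STEP_ID": 2}, {"JOB_STEP_ID": 3}]
main_rows = [{"JOB_STEP_ID": 2}]

# LEFT JOIN ... WHERE j.JOB_STEP_ID IS NULL means "in temp but not in main"
main_ids = {r["JOB_STEP_ID"] for r in main_rows}
delta = [r for r in temp_rows if r["JOB_STEP_ID"] not in main_ids]
```

Only the rows with IDs 1 and 3 survive, i.e. the new rows to be inserted into the main table.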
Not sure what the link is between the stored procedure on which the package hangs and the event info above :-( -
Problem loading threat model.
Unfortunately, we were unable to load all the elements from this file. The following elements may be missing from your display, or be missing threats, mitigations, or certifications. Please check to see if the data is intact. If so, you are ok to save.
Otherwise please report the issue at
http://social.msdn.microsoft.com/Forums/en-US/sdlthreatmodeling/threads/
You can copy this message to the clipboard by pressing Ctrl-C.
(DataFlow) Application Data
(DataFlow) Application Data
(DataFlow) Command/Response
(DataFlow) Commands/Responses
(DataFlow) Commands/Responses
(DataFlow) Commands/Responses
(DataFlow) Machine Deployments
(DataFlow) Machine Deployments
(DataFlow) Workflow Execution
(DataStore) McQueen Configuration Data
(Interactor) Avamar Data Protection System
(Interactor) vCenter Endpoints
(Process) vCAC
(Process) vCO
(User) Application Annie (McQueen end user)
(User) Virtual Victor (McQueen Admin)
OK
Hi,
Are you talking about Threat Modeling Web Applications? If yes, please post the question to the MSDN forum for the Threat Model:
https://social.msdn.microsoft.com/Forums/en-US/home?forum=sdltools%2Csdlprocess&filter=alltypes&sort=lastpostdesc
PS: This is a forum about Visio (a drawing program).
Regards,
George Zhao
TechNet Community Support
It's recommended to download and install the Office Configuration Analyzer Tool (OffCAT), which is developed by Microsoft Support teams. Once the tool is installed, you can run it at any time to scan for hundreds of known issues in Office programs.
Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact [email protected] -
Code inspector for unreleased FM
Hello,
Is there a quick way to detect if a package or program is using an unreleased FM?
Thank you
Dear Joao,
I ran some scans on objects with unreleased functions but, unfortunately, it didn't raise any message.
However, you can scan your objects with the test 'Search Functs.' --> 'Search for ABAP Statements with Regular Expressions' and FUNCTION as the expression. You will get a list of all the FM calls in your search set and can then manually check whether they are unreleased. Cumbersome, but better than going through all the sources.
Hope this will help you.
regards,
Hans -
Check Register - T.code FCHN - ZReport
Dear Friends,
I am a Functional Consultant.
The user wanted to have more fields on the standard Report Check Register - T.code FCHN.
So, I checked and found the tables PAYR,EKPO,EKKO,BSEG and related fields necessary for the development of the report.
My ABAPER is asking, how to link all the tables and what is the logic.
Please advise how to proceed?
Regards
MSReddy
Dear Friends,
What you say is correct. But that ABAPer, who is a regular employee, is the blue-eyed boy of the IT department head, and I am only a consultant.
So I have to find the tables and the logic, and he writes the code.
Sad, but true.
Please suggest solution to the query posted.
Regards
MSReddy -
Maximum stock for Dangerous Goods
Good Morning,
We have a requirement in our organisation to limit the amount of stock we hold in our warehouse of a particular material.
Is there a way of setting a maximum stock holding for a particular material such that it will not allow that maximum figure to be exceeded?
Many thanks in advance for your help.
Mike
Hi,
There is no standard way of achieving this.
You could use MRP type VB and set a maximum stock level on the MRP view of the material, but this would not stop any orders or receipts from happening once the maximum stock has been reached.
It might be easier to write a simple ABAP that runs regularly, checks the stock of selected items, and warns you when the limits have been exceeded (although it may be too late by then, at least it would be possible to do something about it quickly).
The other alternative is to use a user exit or a modification to MIGO to check the stock quantity being received against the maximum quantity for the material (you would have to use a new table with a list of the materials to be checked this way and the maximum stock figures per plant/storage location combination).
I am not aware of any standard option though.
Steve B -
Transport reqs (plng cubes/areas)
hi,
Can I include multiple planning cubes in a single workbench request?
While collecting cube contents, which flow do I have to consider? It is purely a planning cube and is not connected with any DTP/transformation, but I built one BEx query on it. Can I use only the necessary objects, or the dataflow after, or shall I transport the query separately?
Can I include multiple planning areas in a single customizing request?
Can I include multiple planning folders in a single customizing request?
Thanks.
Raju
While collecting the planning cube, which grouping option do I have to consider? I simply have a planning cube with a BEx query on it. I haven't included the planning cube in any MultiProvider.
Grouping options:
only necessary objects
in dataflow before
in dataflow after
in dataflow before & after
Can I include multiple planning folders in a single customizing request?
Regards, -
How does the codepage of the job server and the datastore work together?
In Data Integrator, a datastore can be used as a source or a target of an ETL transaction (called a dataflow). The dataflow is executed by a single job server. During reading or loading, Data Integrator will "transcode" from the datastore codepage to the codepage of the job server. As you can imagine, you will need to be careful when selecting the codepage of the job server, as it will need to be able to represent all characters you are trying to read or load. In other words, the codepage of the job server MUST be a superset of all codepages used in all datastores that job server may encounter.
If Great Big Company Inc. selected MS1252 (which supports many western languages, including English) for the job server, it would potentially lose characters coming in from the Korean customer database. That's why it is recommended that you choose UTF-8 for the job server when processing a mix of multi-byte and single-byte data. UTF-8 can represent all of the characters used in both source databases.
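The superset rule is easy to demonstrate in any language. This Python sketch (illustrative only, not Data Integrator code) shows a Korean string surviving a UTF-8 round trip while cp1252 (MS1252) cannot represent it at all:

```python
korean = "고객"  # "customer" in Korean

# UTF-8 is a superset: every character round-trips losslessly
assert korean.encode("utf-8").decode("utf-8") == korean

# cp1252 (MS1252) has no Korean code points, so encoding fails
try:
    korean.encode("cp1252")
    survived = True
except UnicodeEncodeError:
    survived = False
print("survived cp1252:", survived)
```

This is exactly the character loss the job server would hit if its codepage were narrower than a datastore's.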
It is slightly difficult to post the code as it is a huge project. I think the issue is connected with the actual 3D control, because when I copy it and paste it into another blank VI, it shows the same behavior. I have tried to attach a VI where I copied the indicator from my program and put some dummy data in. I keep getting an error 'The contents of the attachment doesn't match its file type'. I guess I need help attaching the VI.
When I do, the following will be true...
I've also put a few of the property nodes into the VI to see what impact they have.
Basically, run the VI and look at the data. On my screen the points look a sort of muddy brown color. With the 'fast draw' option selected, when I click and drag on the graph, the labelling disappears and the points change color to reflect the values seen on the color palette indicator. Changing the colors in the Scatter Color Ramp control has the expected effect of changing the plot colors, but only in the fast-draw case. I want the colors to be there all the time, but cannot find the magic trick to set the palette for the points when not in 'fast draw' mode, i.e. clicked and held.