Data Acquisition from Agilent 86140B

Hi, I'm using LabVIEW 6.1 and trying to read data from an Agilent 86140B Optical Spectrum Analyzer over GPIB.
I downloaded the driver set from the website, but couldn't find how to actually read the data from the device.
Any suggestions?

The drivers on the National Instruments driver site appear to all be IVI drivers, which muddies the water a bit. I'm pretty sure that to use this instrument under the IVI paradigm you also need to download some additional IVI components for it from the Agilent site, but it isn't clear.

If you have the programming handbook of SCPI commands for this instrument, you can write your own "non-IVI" driver in LabVIEW. First, find the "plug and play" drivers for an instrument of the same flavor (ideally an OSA, but for the most part a spectrum analyzer should do as a starting point, preferably an Agilent one). Then go through the available VIs in this driver, selecting the ones you need initially to perform the minimum of your task, i.e. instrument initialization, measurement setup, measurement trigger, readback of measurement data, and close instrument. You can then go in and edit the command strings to match the ones needed for your instrument, saving the VIs to new names that reflect your instrument model.

Even without the manual you may be able to get much of the command-string info from the instrument itself. Agilent, on many of their instruments, displays the command string (SCPI on newer ones) that corresponds to the settings you are entering on the front panel. Tedious for sure, but if you can't find the type of drivers you need ...

When looking at the "donor" instrument's drivers you may find that there are a lot of similar command strings. An OSA has a great deal of similarity to an RF SA once you are past the front end, so setting them up is very similar.
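To make the "edit the command strings" step concrete, below is a minimal sketch of that init / setup / trigger / read / close sequence done with raw SCPI over VISA. It is written in Python with PyVISA rather than LabVIEW purely for compactness; the same strings would go into the VISA Write/Read nodes of your edited VIs. The GPIB address and the exact mnemonics are assumptions based on typical 86140-series documentation, so verify them against the 86140B programming guide.

# Minimal sketch of a "non-IVI" SCPI session with an 86140B-class OSA.
# The address and command strings are assumptions; check the manual.
import pyvisa

rm = pyvisa.ResourceManager()
osa = rm.open_resource("GPIB0::23::INSTR")   # hypothetical GPIB address
osa.timeout = 10000                          # ms; OSA sweeps can be slow

print(osa.query("*IDN?"))                    # confirm you are talking to the OSA

osa.write(":SENS:WAV:STAR 1520NM")           # measurement setup (assumed commands)
osa.write(":SENS:WAV:STOP 1570NM")
osa.write(":INIT:IMM")                       # trigger a single sweep
osa.query("*OPC?")                           # block until the sweep completes

trace = osa.query_ascii_values(":TRAC:DATA:Y? TRA")   # read back trace A (assumed)
print(len(trace), "points")

osa.close()
rm.close()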
P.M.
Putnam
Certified LabVIEW Developer
Senior Test Engineer
Currently using LV 6.1-LabVIEW 2012, RT8.5
LabVIEW Champion

Similar Messages

  • Reverse data output from Agilent 3458A

    I'm taking 500 sub-samples of a 500 ms ramp waveform from a function generator with the attached VI, and the data output is reversed in time. That is, the data as displayed on the waveform chart shows the TRAILING edge first: a time-mirror image of the actual waveform captured. I thought this had to do with LIFO/FIFO, so I forced FIFO, but it doesn't help. What must I do to get the proper time-sequence data from the DMM?
    I can't seem to attach my VI, but it's identical to the LabVIEW sample Agilent 3458 Acquire Multiple Measurement.vi, with the addition of a VISA Write sending MEM FIFO.

    OK, the data just comes out as an array. Well, if you know that the data is backwards, just use Reverse 1D Array (as I already stated) to flip it back.
    I remember working with that DMM before. I never found a way to get the data to come out like a FIFO. That's just the way the instrument works. Reverse the array and be done with it.
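    If you end up post-processing the readback outside LabVIEW, the same one-line fix applies. A minimal sketch in Python, with made-up sample values:

    # Equivalent of LabVIEW's "Reverse 1D Array" for data that arrives
    # trailing-edge first. The sample values are made up for illustration.
    import numpy as np

    raw = np.array([0.5, 0.4, 0.3, 0.2, 0.1])   # as read back: time-reversed
    data = raw[::-1]                             # restore proper time order
    print(data)                                  # [0.1 0.2 0.3 0.4 0.5]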
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Data acquisition from HP 8510C using LabVIEW

    Hi,
    I am using LabVIEW 7.0 to control an HP 8510C network analyser through GPIB, using the HP 8510C drivers. In the getting-started VI, the two options of reconfiguring and resetting seem to work fine, but the measure-only option is not working, in the sense that it executes the measurement on the instrument but no data appears in the computer panels for the real and imaginary data. The HP 8510C panel shows 'requested data currently not available' at the same time. Can you tell me how to fix this problem?
    Kaushik

    Hi Kaushik,
    I have not got the HP 8510C NA or the 8510C driver!
    However, here is the net address for the programming manual for the 8510C:
    http://www.home.agilent.com/cgi-bin/pub/agilent/search/r2v2/generalSearchResults.jsp?QueryValid=true
    If you have the software toolkit disk, then example 6 in the basic folder
    will provide the essential GPIB commands to achieve your goal.
    Remember that the 8510C is a pre-IEEE 488.2 instrument, so not all VISA commands are compatible.
    Hope that helps
    xseadog
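    As a rough illustration of the kind of raw-command session that example 6 walks through: 'requested data currently not available' usually means the data was requested before a sweep had completed, so take a single sweep first and only then ask for the data. The sketch below uses Python with PyVISA; the mnemonics are assumptions from HP analyzers of that era, so take the real ones from example 6 or the programming manual, and remember that 488.2 common commands such as *IDN? may not exist on this box.

    # Raw GPIB session sketch for a pre-IEEE 488.2 instrument like the 8510C.
    # All command mnemonics are assumptions; use the ones from the manual.
    import pyvisa

    rm = pyvisa.ResourceManager()
    na = rm.open_resource("GPIB0::16::INSTR")   # hypothetical address
    na.timeout = 30000                          # ms; 8510 sweeps can be slow

    na.write("SING;")                           # single sweep, hold when done
    na.write("OUTPDATA;")                       # then request the trace data
    print(na.read())

    na.close()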

  • Acquire data from Agilent DSO3104 in averaging mode

    Hi,
    I am trying to acquire data from an Agilent DSO 3104X in LabVIEW. When I configure the DSO in normal mode it acquires data, but when I configure it in averaging mode it does not acquire data in LabVIEW.

    I am running LabVIEW code for the "Agilent 2000 3000 X series" DSO driver (downloaded from http://forums.ni.com/ni/attachments/ni/140/47325/1/agilent_2000_3000_x-series.zip), VI examples "acquire waveform single" / "acquire waveform continuously", but it only acquires in normal mode. Whenever I change to average mode, manually or through the "configure acquisition" VI, I get the same error message, "No data for operation", on the PC as well as the oscilloscope, plus "Query unterminated".
    Please find the error message attached.
    Even if I run the "read current waveform" VI, averaging mode is not supported.
    I have not changed the default timeout values in the default VIs or sub-VIs. Please advise what values to use for sampling at 5 GSa/s with averaging over 256 samples on the DSO.
    Attachments:
    error.jpg 40 KB
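    For what it's worth, "Query unterminated" from these scopes generally means a read was attempted when the instrument had nothing in its output queue, i.e. the data was requested before the averaged acquisition finished; 256 averages need 256 trigger cycles, which can easily outlive the default VISA timeout. Below is a minimal sketch of sequencing that should work, shown in Python with PyVISA outside the driver VIs (the SCPI commands are standard for the 2000/3000 X-series, but verify against the programmer's guide; the address is a placeholder):

    # Sketch: averaged acquisition on a 2000/3000 X-series scope.
    # Key points: longer timeout, and wait for completion before reading.
    import pyvisa

    rm = pyvisa.ResourceManager()
    scope = rm.open_resource("USB0::0x0957::0x17A6::MY_SERIAL::INSTR")  # placeholder
    scope.timeout = 60000                  # ms; 256 averages take many triggers

    scope.write(":ACQuire:TYPE AVERage")   # averaging mode
    scope.write(":ACQuire:COUNt 256")      # average over 256 acquisitions
    scope.write(":DIGitize CHANnel1")      # acquire and stop
    scope.query("*OPC?")                   # block until acquisition completes

    scope.write(":WAVeform:SOURce CHANnel1")
    scope.write(":WAVeform:FORMat ASCii")
    data = scope.query_ascii_values(":WAVeform:DATA?")
    print(len(data), "points")

    scope.close()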

  • How to store the data coming from a network analyser into a text or Excel file

    Hi everyone,
    I'm using an Agilent 8719ET network analyser and wish to store the data coming from the network analyser into a text or Excel file.
    Presently I'm able to get the data on a LabVIEW graph using GPIB. Can anyone suggest how to go ahead after the collect-data sub-VI? How can the data be stored into a file apart from being shown on the graph?
    Attached is the vi for kind consideration...
    Looking for help
    Regards
    Rohit
    Attachments:
    Agilent 87XX Series Exceed Max Meas.vi 43 KB

    First let me say that your code really looks pretty good. The data handling could be made more efficient by calculating the number of data points that will be in the completed dataset and preallocating the entire array, but depending on your answers to my questions, the logic in the lower shift register may be going away, so we won't worry about that right now.
    The thing I need to know before addressing the data-storage question is: each time you call "Collect and Display Data.vi", how many elements are in the array? Are you reading single data points, or a group of data? (BTW: if the answer to that question is obvious from the way the other VIs are set up, I don't have the drivers, so I can't tell what the setup values are.) Second, how fast does the loop iterate? Are we talking msec per loop? Seconds? Fortnights?
    The issues here are two-fold: how much data? and how fast is it coming? The answers to these will tell you how to save the data.
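    Once those questions are answered, the file write itself is the easy part: inside LabVIEW the usual choice is the built-in spreadsheet-file write VI (Write To Spreadsheet File.vi in older versions), or TDMS for fast streams. As a minimal sketch of the same logic in Python, assuming one array of readings per loop iteration (file name and values are made up):

    # Sketch: append one row per reading to a CSV file Excel opens directly.
    # The loop and data stand in for the acquisition loop and Collect Data.
    import csv
    import time

    with open("na_trace.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "point_index", "value_db"])
        for i in range(3):                  # stand-in acquisition loop
            data = [-10.0 - i, -10.5 - i]   # stand-in Collect Data output
            for j, value in enumerate(data):
                writer.writerow([time.time(), j, value])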
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps

  • Data Recovery from an encrypted Time Machine HD

    Does anyone have any experience with having data retrieved from a Time Machine hard drive that is encrypted? I've clean-installed Mountain Lion and now can't get at my Time Machine backup to restore files.
    Apple support recommended I post here and explore 3rd-party recovery services.
    Please, please, anyone!

    v8r wrote:
    It was my first Time Machine backup. Midway I felt that it was stuck, so I powered down my MacBook Pro. Now it's not booting. I gave the details in a simultaneous thread, "Gray screen, spinning cog and midway to first back up of time machine". I am assuming that I have lost all the data on my hard drive; can I at least get some from the incomplete Time Machine backup?
    Well, it might be possible, but it won't be in any form that you can retrieve automatically. So your first priority should be to get your computer to boot, after which you can start worrying about getting your data from an incomplete TM backup. I've done that in the past and it works sometimes, but it really depends on the situation. I had to do it from Terminal too, as it didn't work from the Finder.
    But it need not be that you lost all data on the main drive. Boot from the SL install DVD and see if Disk Utility sees the drive. If so, repair the drive and try booting from it. If you have DiskWarrior it might help too, if Disk Utility cannot fix some errors.
    Thanks,
    v8r.

  • No master data transmitted from ERP to GTS

    We are using ERP 4.70 with the following settings:
    SAP_APPL 470 SP-Level 0031
    PI 2004_1_470 SP-Level 0016
    PI_BASIS 2005_1_620 SP-Level 0017
    SLL_PI 720_470 SP-Level 0007
    together with SAP GTS 7.2 SP-Level 09
    I created the message types /SAPSLL/DEBMAS_SLL and /SAPSLL/MATMAS_SLL manually and activated them.
    I try to send customer master data to GTS, but the partner is not transmitted. In the log protocol in GTS an error message appears saying that no data was collected from table /SAPSLL/TCOGVA.
    If I try to send a material master from ERP to GTS, the RFC connection gets broken with an error message:
    RFC error (The transaction has dumped the connection). What I have found out is that the error appears in the function /SAPSLL/API_1006_SYNCH_MASS.
    Has this maybe something to do with a wrong IDoc type? Do I have to do something in some tables? Am I using the wrong programs? I use /SAPSLL/MATMAS_DISTRIBUTE_R3 and /SAPSLL/DEBMAS_DISTRIBUTE_R3.
    I made exactly the same customizing settings in another system at a customer, and it worked fine there.
    Does anybody have an idea? Thanks to all for the possible help.
    Thanks very much in advance

    Hi Andreas,
    there are several possibilities why this is not working.
    First of all you should check that your RFC settings are correctly maintained.
    Please bear in mind that the logical systems have to be assigned to logical system groups and that GTS and R/3 shall not be in the same group.
    Please also ensure that the logical system name is the same in both systems, R/3 and GTS.
    Secondly, there is also the possibility that necessary tables are not filled correctly.
    Please run report /SAPSLL/PLUGIN_CHECK_R3.
    It could be that e.g. the tables TBD24 and TBD62 are not filled
    for the relevant message types (in your case /SAPSLL/DEBMAS_SLL and /SAPSLL/MATMAS_SLL).
    To update these tables correctly, please do the following:
    - BD53
    - select the message type (/SAPSLL/MATMAS_SLL) and edit it
    - select a segment
    - save
    - deselect the segment again
    - activate
    - in transaction BD60, set the FM to the correct value
      (/SAPSLL/MATMAS_DISTRIBUTE_R3)
    You can then check with the program /SAPSLL/PLUGIN_CHECK_R3 whether
    all 4 required tables (TBD24, TBD62, TBDA2, TBDME) are filled or not.
    I hope this helps with the issue.

  • Data import from EBS failed via FDMEE in FDM. Getting error message "Error connecting to AIF URL"

    FDM data import from EBS failed via FDMEE after rolling back the 11.1.2.3.500 patch. Getting the error message below in the ERPI Adapter log.
    *** clsGetFinData.fExecuteDataRule @ 2/18/2015 5:36:17 AM ***
    PeriodKey = 5/31/2013 12:00:00 AM
    PriorPeriodKey = 4/30/2013 12:00:00 AM
    Rule Name = 6001
    Execution Mode = FULLREFRESH
    System.Runtime.InteropServices.COMException (0x80040209): Error connecting to AIF URL.
    at Oracle.Erpi.ErpiFdmCommon.ExecuteRule(String userName, String ssoToken, String ruleName, String executionMode, String priorPeriodKey, String periodKey, String& loadId)
    at fdmERPIfinE1.clsGetFinData.fExecuteDataRule(String strERPIUserID, String strDataRuleName, String strExecutionMode, String strPeriodKey, String strPriorPeriodKey)
    Any help Please?
    Thanks

    Hi,
    Getting this error in ErpiIntegrator0.log. The ODI session ID was not generated in ODI/FDMEE. If I import from FDMEE, it imports data from EBS.
    <[ServletContext@809342788[app:AIF module:aif path:/aif spec-version:2.5 version:11.1.2.0]] Servlet failed with Exception
    java.lang.RuntimeException
    at com.hyperion.aif.servlet.FDMRuleServlet.doPost(FDMRuleServlet.java:76)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
    at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:324)
    at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:460)
    at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
    at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
    at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:163)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:57)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
    at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)

  • Getting data transferred from old HD to new ones, leaving 2 slots filled...

    Hello,
    *(Below is a thread I started in another forum, but I didn't get my final questions answered (see the bottom of this post in BOLD), and am hoping for further help and suggestions. I know there is a lot to read here, but I think this information will help other novices such as myself. Thanks!)*
    I have 2 new HDs (each 1 TB) uninstalled, and I have my old HD (160 GB) currently residing in one of the two slots in my G5. Any ideas on the best way/process to get the data transferred from the old HD to the new ones, leaving my two slots filled with the 2 new drives AND my data from the old HD transferred and intact, with a RAID 0 setup?
    OR, are there any other suggestions for a good setup with the 2 new HDs?
    I'm open to suggestions.
    I always back up on external HDs, so is there a need for me to worry about setting up a RAID 0?
    I'm doing video processing for DVD release.
    Should I just max out the 2x1 TB HDs and not worry about an HD failure?
    All suggestions welcomed!
    Thanks, Jack
    *FROM THE CUTTERMAN:*
    You are limited with only 2 drive slots. In this scenario what usually works best is a small fast drive (e.g. VelociRaptor or SSD) for OSX/applications and a large drive for video files. It is not advisable to have the operating system running from a striped array.
    Since you already have the drives, here is what you can do.
    Install one new drive and format the partition you wish to use for OSX. You may not want to use the whole drive for OSX, so make 2 partitions. Be sure that it is GPT-formatted so it can boot.
    Use Carbon Copy Cloner to copy your current OSX partition to the new one.
    Set the startup disk to the new partition.
    Remove the old drive and replace it with the second new one.
    Reboot, and partition/format the new drive.
    *Hey Cutterman,*
    Thanks for the advice. That sounds like a good setup. I'll have to research the process of doing what you've advised, but it should work out OK:
    A. Install one new drive and format the partition you wish to use for OSX. You may not want to use the whole drive for OSX, so make 2 partitions:
    1. Partitions are done under Disk Utility, correct? Never done it before; will take a look.
    B. Be sure that it is GPT-formatted so it can boot.
    2. GPT is also done under Disk Utility, correct? Here I start to get a bit confused, as I've formatted Extended Journaled, SO, the partition with OSX & applications is formatted GPT, and the rest of the HD is formatted as Extended Journaled?
    C. Use Carbon Copy Cloner to copy your current OSX partition to the new one.
    3. So, I partition the first new HD and use Carbon Copy Cloner (also under Disk Utility) to copy OSX & applications from my original HD to the first new HD, correct?
    D. Set the startup disk to the new partition.
    4. This will be an option when I set up the partition?
    E. Remove the old drive and replace it with the second new one.
    F. Reboot, and partition/format the new drive.
    5. QUESTION: So, you recommend not striping the HDs. If I do a RAID 1, which sets up a mirror of the HDs, and since the first new HD is partitioned with OSX & applications AND is set up as the startup disk, will the second new HD also mirror that? Another QUESTION: Even if the second new HD does not mirror OSX & applications, the second HD will only mirror the space/partition not utilized by the OSX & applications partition, correct?
    Thanks for your help!!
    Jack
    *FROM THE CUTTERMAN:*
    OK, I will try to answer your questions. To begin, if you want to mirror the 2 drives then you will need to boot from the SL install DVD or an external (USB/FireWire) drive to set it up. IMO, for your purposes it is too much hassle and a waste of HD space.
    1) Yes, partitions are created in Disk Utility.
    2) Yes, you choose a volume scheme (i.e. number of partitions) and size them by dragging the separator bar. Then select the planned boot partition and click Options..., then choose GPT. I think it is usually the default, but check and make sure. Extended Journaled is the usual format.
    3) You need to download Carbon Copy Cloner. It is a free tool that will copy the entire image of your current system partition to the new one and make it bootable. Consider making a donation, as it is a very useful and frequently updated utility. It is fairly intuitive to use. You can also use the restore feature in Disk Utility, but I have had more experience with Carbon Copy Cloner.
    4) Once the copy process is finished, the new boot partition will show up under the startup disk selections (this tool is in System Preferences).
    *FROM 666Sheep:*
    If I may correct one thing: OP, don't choose GPT (GUID Partition Table). You have a PPC Mac (G5), so the valid partition type for you is Apple Partition Map (APM).
    GPT is for Intel Macs, and you will not be able to boot from that kind of partition.
    *FROM THE CUTTERMAN:*
    Thanks for the correction, my bad. No familiarity with non-Intel Macs.
    G5 PowerMac, 2.5 GHz Dual, Dec. 2004, 6.5 GB RAM, 149 GB HD
    *Hi All,*
    *This process will come to a head on Tuesday/Wednesday, and I hope to be successful in transferring the old HD data to the new HD(s).*
    *A: I'm still a little foggy here, as I thought I'd set up the 2 new 1 TB HDs so that if one of them failed, I'd still have the data backed up on the other HD; more ideas concerning this would be reassuring.*
    *WHAT DOES "IMO" MEAN?*
    *(IMO for your purposes it is too much hassle and a waste of HD space.)*
    *As I understand the points made: after I partition the first new HD and Carbon Copy the data from the old HD, I then install the second new HD, and this will just act as overflow from the first partitioned HD, correct? Because, if I set up a RAID 1/mirror, the OSX-partitioned portion of the first new HD will not copy to the second new HD, and that space on the second new HD will be wasted, correct?*
    *Also: I have to set up RAID 1 (if I do not set up RAID 0), correct?*
    *RAID 1*
    *From WIKI: RAID 1 mirrors the contents of the disks, making a form of 1:1 ratio real-time mirroring. The contents of each disk in the array are identical to those of every other disk in the array. A RAID 1 array requires a minimum of two drives.*
    *Carbon Copy Cloner is a free tool that will copy the entire image of your current system partition to the new one and make it bootable.*
    *1. Is my old HD partitioned? Is that done automatically by Apple prior to purchase?*
    *2. Do I Carbon Copy the whole old HD or just parts? (e.g. OSX, Photoshop, After Effects, various files)*
    *3. I read that it is necessary to DEACTIVATE Photoshop (I have CS3) prior to doing a Carbon Copy. Is that correct? If so, does this also apply to other Adobe applications, such as After Effects and Illustrator?*
    *Thanks to all for the help!!*
    *G5 PowerMac, 2.5 GHz Dual, Dec. 2004, 6.5 GB RAM, 149 GB HD*

    Confusing information:
    A1: You also want to make sure the drive you are backing up to is formatted Mac HFS Extended (HFS+) if using Mac OS 8.1 or above.
    A2: On PowerPC Macs, your clone should be partitioned as Apple Partition Map.
    *Q? I thought the new HD (1 TB) should be formatted as Extended Journaled?*
    B: Also disable Spotlight (in 10.4 only) on your destination drive using Apple menu -> System Preferences -> Spotlight -> Privacy to add the destination drive to the pane.
    *Q? I'm using 10.4.11, so I should follow these instructions?*
    C: If possible, boot into safe mode to perform the backup (holding the SHIFT key at startup). In addition, you can clone while logged into another administrative user that you don't use at all, to avoid further complications from changes which may be happening to your regular user (though don't use Fast User Switching to get into that other user, since that other user is still active when fast user switching is used). Otherwise you are going to be running a backup on a live system which could have changes happening while you are attempting to back up. These may yield an imperfect clone, with uncertain success at recovery. It may be possible that your clone will have its own hardware issues, so make at least two copies.
    *Q? ?????????????*
    *I GOT THE ABOVE INFO. FROM THE FOLLOWING SITE:*
    Making a clone/mirror/duplicate backup
    http://www.macmaps.com/backup.html#SHORTANDEASY
    *ALSO, WHAT DO YOU THINK ABOUT THE ADVICE ON THIS PAGE:*
    http://www.levoltz.com/2010/04/21/how-to-transfer-data-to-your-new-hard-drive/

  • Unable to delete data files from my iPod if used as data storage...

    Hello everybody,
    I bought an iPod nano 8GB months ago. I've got an issue when I use it as disk storage that I'm not able to sort out. I've checked all the topics but I cannot find any solution.
    Example: I use the iPod with the option "Manually manage music and videos". I put audio and video files on my iPod through iTunes, then I also put some files on my iPod as normal data files through the Mac Finder (easy: I drag VLC, JPG files etc. from folders to my iPod). The iPod/iTunes summary page correctly shows the capacity column colored in blue (audio), purple (video) and orange (data). No problems when I transfer data files to another computer.
    Problem: when I delete the data files from my iPod (through the Finder) the orange column doesn't disappear (I have no problem if I delete audio/video files via iTunes), and although I've actually deleted all the files, if I try to add some other data or video/music files to my iPod, iTunes says that there is no space available on my device. It's very strange; I check and double-check all the options, but eventually I have to restore my iPod every time and (pretty annoying) add all my files again.
    It seems that although I sent my files to the trash bin, iTunes (fully updated) doesn't recognize the change.
    It'd be great if someone could help me out, as it's very important for me to be able to use my iPod as disk storage. Thanks a lot!

    It seems that although I sent my files to the trash bin, iTunes (fully updated) doesn't recognize the change.
    The files still occupy space on the iPod until the Trash is emptied: empty it from the Finder menu or by control-clicking the Trash in the Dock.

  • Data Load from a DSO to a Cube

    Hello,
    I am facing a problem
    Case:
    There is a DSO with following Char/Key Figure:
    Characteristic        Record 1   Record 2   Record 3
    A (key)               A1         A2         A3
    B (key)               B1         B2         B3
    C                     C1         C1         C1
    D                     D1         D1         D1
    E                     E1         E1         E1
    F                     F1         F1         F1
    KF1 (key figure)      10         20         30
    I loaded the data below from the DSO into a cube (structure with data shown below):
    Dimension             Value
    C                     C1
    D                     D1
    E                     E1
    F                     F1
    KF1 (key figure)      50
    My question is: why am I not getting the KF1 value as 60?
    When I tried including the key fields of the DSO, I got all the records in the cube.
    Where is my mistake?
    Can someone correct me, please?
    Thanks.
    Avinash.

    Below are the basic steps which we follow in any BI 2004s system:
    1) Create the DataSource. Here you can set/check the Source System fields.
    2) Create a Transformation for that DataSource (no more update rules/transfer rules).
    2.1) While creating the transformation for the DataSource, it will ask you for the data target name, so just assign where you want to update your data.
    DataSource -> Transformation -> Data Target
    Now, if you want to load data into the data target from a Source System DataSource:
    1) Create an InfoPackage for that DataSource. If you are creating an InfoPackage for new DataSources, it will only allow you to update up to the PSA; you will see all other options as disabled.
    2) Now create a DTP (Data Transfer Process) for that DataSource.
    3) Now schedule the InfoPackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
    If you are loading data from one data target to another, there is no need to use the PSA; you can directly execute the DTP in that case.
    DataSource -> Transformation (IP/DTP) -> Data Target 1 -> DTP -> Data Target 2
    Use the link below for a detailed example:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fc61e12d-0a01-0010-2883-e2fc63ef729b
    InfoSources are no longer mandatory with BI 7.0; below is a link to scenarios where we use InfoSources:
    http://help.sap.com/saphelp_nw04s/helpdata/en/44/0243dd8ae1603ae10000000a1553f6/content.htm
    Full or delta depends on your requirement...
    Check the thread below, "difference between the various loads", to understand it better.
    Hope it helps.
    Message was edited by:
            sriram viswanathan

  • Data loading from DSO to Cube

    Hi,
    I have a question,
    In the book TBW10 I read about the data load from a DSO to an InfoCube:
    "We feed the change log data to the InfoCube; 10, -10, and 30 add up to the correct value of 30."
    My question is: the cube already has the value 10; if we are sending the 10, -10 and 30 values (delta), the total should be 40 instead of 30.
    Please can someone explain this to me?
    Thanks

    No, it will not be 40.
    It will be 30 only.
    Since the cube already has 10, the before image will nullify it by sending -10, and then the correct value from the after image will be added as 30.
    So it will be like this: 10 - 10 + 30 = 30.
    Thank you.
    Regards,
    Vinod

  • How to get data back from an action?

    Hello,
    Would it be possible to get data back from an action (other than the EO_MESSAGE and ET_FAILED_KEY parameters)?
    For example, we have an order with order positions, and we need a "function" to e.g. count all positions. For performance reasons the function should not be processed each time the order is changed or read or a position is added. Instead, the function should be processed only if it is called explicitly.
    Is it possible to create a kind of action which actually counts all entries and exports their number?
    How do I mark a parameter in is_parameters as exporting?
    Is this just done by (naming) convention?
    What is the preferred way to have "methods" with returning/exporting values?
    Regards,
    Lorenz

    Hello Lorenz,
    As you have already figured out, the Action API provides you with only the messages and failed keys, if any.
    Post action execution, you can always execute a retrieve or retrieve by association to get the latest buffer snapshot, which of course would include the changes that you have made in your action.
    If you want to ensure that users have explicit control over the execution of your "function", then of course you should model it as an action on the BO.
    The parameter is_parameters is an IMPORTING parameter. You CANNOT use it to export anything back from the action. For importing, you can of course use any structure as the is_parameters, which you model as the action parameter structure while modelling your BO action.
    From an external entity the only way to interact with a BO is by consuming the BO services, and you are bound by the BOPF standard interfaces. Any and all data you require needs to be modelled as node attributes (persistent or transient) and fetched using the RETRIEVE, RETRIEVE_BY_ASSOCIATION or QUERY services.
    Regards,
    Indranil.

  • How to get the date format from the Locale object

    Hi All,
    I am new to Web Channel.
    I need to know the date format from the locale.
    Suppose there is a date "01/25/2010" in a date field; I want to get the string "mm/dd/yyyy". Actually, I have to pass the date format to the backend when I call an RFC.
    Is there any way to get the date format from the "Locale" object? I should get the date format from the Locale object.
    I get the Locale object from the "UserSessionData" object, but how do I get the date format from it?
    I am not looking for the date value. I am looking for the current locale's date format ("mm/dd/yyyy" or "dd/mm/yyyy" or "mon/dd/yyyy"), whatever the locale date format is. I could not find an example which shows how to get the date format from a "Locale" object.
    Any help will be appreciated with rewards.
    Regards.
    Web Channel

    Hi,
    You can get it from the "User" or "Shop" business object.
    Try to get the User or Shop business object as shown below.
    // Fetch the ISA core business object manager from the session
    BusinessObjectManager bom = (BusinessObjectManager) userSessionData.getBOM(BusinessObjectManager.ISACORE_BOM);
    User user = bom.getUser();
    // Grouping separator of the user's number format
    char groupingSeparator = user.getDecimalPointFormat().getGroupingSeparator();
    If you are seeing "1,234.00", then the code above will return ',' (the grouping separator).
    I hope this information helps you to resolve your issue.
    eCommerce Developer.

  • How to pass a date parameter from Report Builder query designer to an Oracle database

    I'm using Report Builder 3.0 connected to an Oracle database. I'm trying to pass a date parameter in the query with no success; I don't
    know the exact syntax. I've tried:
    SELECT * FROM igeneral.GCL_CLAIMS WHERE CREATED_BY IN (:CREATED_BY) AND CLAIM_YEAR IN (:UW_YEAR) AND LOSS_DATE > To_Date('01/01/2014','mm/dd/yyyy')
    and it worked perfectly.
    However, if I try to put a date parameter "From" in place of '01/01/2014', it will not work; a Define Query Parameter popup window appears, and an error occurs after I fill
    in the values (usually I shouldn't get this popup; I should enter the value when I run the report):
    SELECT * FROM igeneral.GCL_CLAIMS WHERE CREATED_BY IN (:CREATED_BY) AND CLAIM_YEAR IN (:UW_YEAR) AND LOSS_DATE > To_Date(:From,'mm/dd/yyyy')
    I appreciate your assistance.

    Hi Gorgo,
    According to your description, you have a problem passing a parameter when running an Oracle query. Right?
    Based on my knowledge, Oracle uses ":" to introduce bind parameters (and "&" for SQL*Plus substitution variables), like we use "@" in SQL Server. In this scenario, maybe you can try '01/01/2014' when inputting "From". We are not sure if there's any limitation of the To_Date() function.
    For your self-testing, you can try the query in SQL*Plus or SQL Developer. Since your issue is related to an Oracle query, we suggest you post this thread on an Oracle forum.
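    One more thing worth trying, offered as an assumption rather than a confirmed fix: FROM is a reserved word in Oracle SQL, so a bind parameter named :From is an easy thing to rule out. Rename it to something like :FromDate, and pass a real date value so To_Date() is not needed at all. A minimal sketch with the python-oracledb driver (credentials and DSN are placeholders):

    # Sketch: same query with a renamed date bind variable.
    import datetime
    import oracledb

    conn = oracledb.connect(user="scott", password="tiger",
                            dsn="dbhost/orclpdb1")   # placeholder credentials/DSN
    cur = conn.cursor()
    cur.execute(
        """SELECT * FROM igeneral.GCL_CLAIMS
           WHERE LOSS_DATE > :from_date""",          # renamed bind variable
        {"from_date": datetime.date(2014, 1, 1)},    # bind a real DATE
    )
    print(len(cur.fetchall()), "rows")
    conn.close()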
    Best Regards,
    Simon Hou
