Export Input Level Data (doesn't work)

I have read about this somewhere, and am just wondering if anyone could explain what's going on. I am attempting to export input-level data, and when I check the export file, the upper-level input-level data is clearly absent. The missing upper-level input-level data is in the sparse dimension, Organization. Any ideas on why this is happening? Is this a bug (6.1.3a)?
Thanks,
Jeff McAhren

I'm using 6.5.1 and that process works in this version.
Regards
Corey Bidmead
Clarity Systems

Similar Messages

  • Need to export customer master data in a batch input file

    Hello all,
    I'm searching for a solution to export customer master data out of a R/3 system (Release 470).
    The challenge is that the file must have the same structure as the file for batch input of customer master data (the import file of program RFBIDE00).
    So, does anybody have an idea whether there is a standard tool to produce such a file? Or has anybody developed a report that creates this structure on their own?
    Thanks for your help,
    kind regards,
    Dietmar

    Hi,
    I have also come across the same problem as above. There is a standard program, RFBIDET0, which has to be customized to produce a file that can be uploaded to RFBIDE00. Has anybody come across this kind of problem before and perhaps already has a program? It would be of great help if you could forward the program or provide guidance.
    Regards,
    Vishnu Priya
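    For illustration, here is a minimal sketch (in Python, outside ABAP) of how such a fixed-width batch-input file could be assembled. BGR00 (session header) and BKN00 (customer header) are the standard RFBIDE00 record structures, but the field selection and widths below are invented placeholders; the real layouts must be taken from the DDIC (SE11):

    # Sketch: assemble a fixed-width batch-input file for RFBIDE00.
    # Field widths are placeholders, not the real DDIC widths.

    def fixed(value, width):
        # Left-justify and pad/truncate a field to its column width.
        return str(value).ljust(width)[:width]

    def session_record(group, client, user):
        # BGR00-style session header: record type 0 opens the session.
        return fixed("0", 1) + fixed(group, 12) + fixed(client, 3) + fixed(user, 12)

    def customer_record(kunnr, name):
        # BKN00-style transaction record: record type 1, one per customer.
        return fixed("1", 1) + fixed(kunnr, 10) + fixed(name, 35)

    with open("customers.txt", "w", encoding="utf-8") as out:
        out.write(session_record("CUSTEXPORT", "100", "BATCHUSER") + "\n")
        out.write(customer_record("0000001234", "ACME Corp") + "\n")

    The export side - reading KNA1/KNB1 and emitting records in exactly this layout - is what RFBIDET0 would have to be customized to do.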

  • Export BW-BCS data to the ECC

    Hi,
    I got a request to create this customized report: an ECC-BCS GL Statement Report in ECC. Basically, in this report the user can compare the total GL figures on both servers, ECC and BCS. At the same time, the user can also track all the adjustments made in BCS via UCMON.
    List of report fields:
    The fields below come from ECC tables:
    1. Company Code
    2. GL Account
    3. GL Description
    4. Total GL in ECC
    The fields below come from BCS tables:
    5. BCS-00 --> for posting level 00
    6. BCS-01 --> for posting level 01
    7. Total GL in BCS (00 + 01)
    8. ECC-BCS Diff (4 - 7)
    To get the BCS data into ECC, my plan is to create the RFC programming.
    I managed to get fields 1 through 4 by calling transaction F.01. Fields 5 through 7 come from the List of Totals Records report in UCMON.
    At first I thought of using BDC to call the report in UCMON, but I ran into problems with the screen handling.
    My last option is to find the exact table behind that report in BW and then pull fields 5 through 7 with a SELECT statement. But so far I can't find the table that keeps all the BCS data. I managed to display the data from the cube (ZBCS_C11); now I need to know how to export the BCS data to ECC.
    I would appreciate it if you could share some input on this. Otherwise my users have to download the GL numbers from ECC and BCS and use a VLOOKUP to match BCS to ECC.
    thanks
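    For what it's worth, here is a minimal sketch of the RFC approach in Python, assuming the pyrfc package and generic RFC_READ_TABLE access. /BIC/FZBCS_C11 follows the usual naming convention for the fact table of cube ZBCS_C11, but fact tables store SIDs/DIMIDs rather than readable keys, so a custom remote-enabled wrapper around RSDRI_INFOPROV_READ is usually the cleaner route:

    # Pull raw rows from the BCS cube's fact table over RFC.
    from pyrfc import Connection

    conn = Connection(ashost="bw-host", sysnr="00", client="100",
                      user="RFCUSER", passwd="secret")  # placeholder logon data

    result = conn.call("RFC_READ_TABLE",
                       QUERY_TABLE="/BIC/FZBCS_C11",  # assumed fact-table name
                       DELIMITER="|",
                       ROWCOUNT=100)

    for row in result["DATA"]:
        print(row["WA"])  # each row is one pipe-delimited fact record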

    Hi Alisher,
    You are referring to the transformation file for the record count. But in the transformation file we map the IDs that are displayed in one format and that we want to change to another.
    In your case, I would like to know: do you have some property in a dimension which holds the record counts? If you do, then you can design it using transformation and conversion files. But in case you don't have any such property, you cannot add a new column to the export file.
    Hope this clarifies.
    Rgds,
    Poonam

  • Export only specific data in XML with Submit by Email button

    I have created a form with multiple subforms that includes user input and calculations. One of these subforms contains fields that are bound to a MS Access database to retrieve data for calculations. All of the fields on this subform are bound to the data connection. The data connection, calculations, and submit by email button all work perfectly. My only problem is that the XML data that is being exported includes the data I bound to the MS Access database. I only want the data that the user inputs. I have changed the binding to "none" on all of the calculated fields to restrict them from exporting.
    I have attached a copy of the XML file that is generated when the "Submit by Email" button is clicked. I only want the data from the "YTDIncStmt" subform to be included in this file. The "MasterData10K1", "MasterData10K2", and "MasterData10K3" are the actual data connections to the MS Access database.
    How do I limit the export to only the data the user input?

    I have the very same problem! I am importing data from MS Access and trying to export XML data that I can sync with QuickBooks. But the XML file contains some of my bound MS Access fields, which QuickBooks doesn't know what to do with, so it rejects my XML files. I am also looking for a way to submit only selected XML data. Did you ever find a solution/workaround?
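    One post-processing workaround, if the binding change alone doesn't help, is to strip the database-bound subtrees out of the submitted XML before passing it on. A minimal Python sketch, assuming the subform names above appear as element names in the exported XML (file names are illustrative):

    # Remove the MS-Access-bound subtrees so only user-entered data remains.
    import xml.etree.ElementTree as ET

    BOUND_SUBFORMS = ("MasterData10K1", "MasterData10K2", "MasterData10K3")

    tree = ET.parse("submitted_form.xml")  # file produced by Submit by Email
    root = tree.getroot()

    for parent in root.iter():
        for child in list(parent):  # copy the list: we mutate while iterating
            if child.tag in BOUND_SUBFORMS:
                parent.remove(child)

    tree.write("user_data_only.xml", encoding="utf-8", xml_declaration=True)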

  • How can I export complex FRF data in ASCII files especially UFF58 format?

    Background:
    I have a problem saving analysed data into a format that another software program (MODENT) can import.
    I am using "SVA_Impact TEST (DAQmx).seproj", to acquire force and acceleration signals from a SISO hammer model test, and process up to the stage of an FRF and coherence plot. I need to export the FRF information into MODENT, meaning it must contain both magnitude and phase (real and complex) information in one file. (MODENT does not need the coherence information, which is only really used during the testing process to know we have good test data. The coherence information can already easily be saved into ASCII text file, and kept for test records - not a problem).
    MODENT is a speciaised modal analysis package, and the FRFs from tests can be imported into its processing environment using its "utilities". The standard is a UFF58 file containing the FRF information.
    Labview does not seem to be able to export the FRF data in UFF58 format. In general, Labview can export to UFF58, but only the real information, not the complex information - is this correct?
    Is there a way to export the real and complex information from FRFs using one file.
    I am working on this problem from both ends:
    1. MODENT support, to help interface into MODENT.
    2. LabVIEW support, to try to generate an export file format with all the necessary analysed data.
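    Outside LabVIEW, one option is to write the dataset 58 file directly. A minimal sketch using the third-party pyuff Python package; the dictionary keys follow pyuff's dataset-58 conventions, and the node/direction numbers and data values are placeholders:

    # Write one complex FRF as a UFF dataset 58 record.
    import numpy as np
    import pyuff

    freq = np.linspace(0, 1000, 801)  # abscissa (Hz)
    frf = np.ones(801) + 1j * np.zeros(801)  # placeholder complex FRF values

    dataset = {
        'type': 58,  # universal file dataset 58
        'func_type': 4,  # 4 = frequency response function
        'rsp_node': 1, 'rsp_dir': 3,  # response DOF (node 1, +Z)
        'ref_node': 2, 'ref_dir': 3,  # reference (hammer) DOF
        'x': freq,  # abscissa values
        'data': frf,  # complex ordinates: real and imaginary parts together
        'id1': 'FRF H12',  # free identification line
    }

    uff_file = pyuff.UFF('frf_export.uff')
    uff_file.write_sets(dataset, 'add')  # append the dataset to the file

    Since dataset 58 stores the ordinate as real/imaginary pairs, both magnitude and phase survive the round trip into MODENT.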

    Hi
    Have a look at the "seproj" and "vi" versions I mentioned in this thread - both available from the supplied example VIs in SignalExpress and LabVIEW. I've also attached them here.
    The "seproj" version has what are called "limits" for the input data, meaning it checks certain criteria to make sure the test data sent for analysis (to give the FRF) is within certain limits: 1. "overload", where the amplitude must stay below a set maximum; 2. "time", where the signal must decay to zero within the time envelope. These are useful test check functions to have.
    I looked at the conversion from SE to LabVIEW VIs, but if I understand correctly, this essentially embeds the SE project in LabVIEW, but does not allow the same viewing of the user windows during testing (e.g. seeing the overload plots, editing the overload levels, etc.)?
    Other settings:
    -signal input range, sensitivity, Iex source, Iex value, dB reference, No. of samples and sample rate.
    -triggering off the hammer, with all the setting options shown in the step setup for "trigger"
    -all settings associated with the load and time limits, presented in a very useful format of either defining by X,Y coordinates or by scaling.
    -all config and averaging options for the frequency response.
    The practice environment in the SEProj version is also useful, as getting the technique right for using the hammer is not always easy, and it allows a quick check of basic settings before doing the actual tests.
    The Nyquist plot in the VI version is useful.
    Happy New Year to everyone !
    Attachments:
    SVA_Impact Test (DAQmx) demo.seproj 307 KB
    SVXMPL_Impact Test (DAQmx).vi 75 KB

  • Can I connect my Apple TV to a Blu-Ray player using an HDMI/USB adapter?

    My Blu-Ray player has a USB input but it doesn't have an HDMI input. Can I connect my Apple TV to the Blu-Ray using an HDMI/ USB adapter? It would allow me to run the audio through my stereo. Thanks to all for your help.

    Can I connect my Apple TV to the Blu-Ray using an HDMI/ USB adapter?
    If you can find such a thing. I really doubt one exists, since USB is a data connector, not a video connector.
    would allow me to run the audio through my stereo.     
    That would depend on this adapter.

  • AUTHORIZATION ISSUE: cube level data restriction in BI

    Hi all,
    I have a few cubes and ODS objects which contain data. The requirement is to restrict the cube-level data.
    E.g.: we have the option to see the cube data in RSA1 (Administrator Workbench): right-click on the cube and choose Manage.
    The requirement is to restrict some users to seeing only data with company code = 111.
    Likewise, a few users should see only company code = 222.
    If they try to see anything other than 111, they should get a "no authorization" message.
    Cube data:
    company code    distribution channel    account    amount
    111             10                      10002      100
    222             20                      10002      200
    333             30                      10002      300
    444             10                      10002      400
    1111            20                      10002      500
    1111            30                      10002      600
    2222            30                      10002      700
    Thanks in advance.
    Jo

    Hi MaikI,
    Thanks for the inputs.
    Actually I want to restrict the data based on the analysis authorization object 0TCAIPROV.
    I want to give s_rs_comp-provider the value *, and control query (data) access through analysis authorizations. I want to create a zanalysisauth authorization which contains 0TCAIPROV = $variable.
    The variable is populated with one or two provider values at runtime.
    Based on the runtime variable population, the user should get access.
    But with this implementation the user is able to open all queries. Where am I going wrong? How can I do this?
    Regards,
    Joseph

  • Export to Excel functionality doesn't work when DataControlScope is shared.

    Hi All,
    I have a task flow whose transaction settings are as below:
    1. Always Begin New Transaction
    2. DataControlScope is isolated (the "Share data controls with calling task flow" checkbox is unchecked).
    I have an Export to Excel functionality with exportCollectionActionListener. The jsff source snippet is given below.
    <af:commandMenuItem text="#{smviewcontrollerBundle.EXPORT_TO_EXCEL}"
                        id="cmi1" icon="/images/excel_icon.jpg"
                        binding="#{backingBeanScope.backing_fragments_RefSetSearch.cmi1}">
      <af:exportCollectionActionListener exportedId="resId1"
                                         type="excelHTML"
                                         filename="SmRefSetCodes"
                                         title="System Reference Inquiry Result"/>
    </af:commandMenuItem>
    The jsff is used in the task flow mentioned above. With this configuration in the task flow, Export to Excel doesn't export any table data to the Excel sheet. But the moment the data-control-scope is changed to shared, export starts working.
    Is there any specific reason for this kind of behavior?
    In all other task flows in my application I also have data-control-scope as isolated, but there, export is working perfectly fine.
    Please help.
    Thanks,
    Gaurav

    Hi Frank,
    Thanks for your reply.
    Exactly. I have just one task flow in which I have a page fragment for which Export to Excel is not working. The moment I change the data-control-scope to shared, the export functionality starts working.
    I do have the correct table id mentioned in the exportCollectionActionListener.
    Is there anything I need to look into?
    Thanks,
    Gaurav.

  • Samson C01U usb mic/low input level

    I recently bought a Samson C01U usb condenser mic to use with Garageband to record voice and piano for my music studio. When I plug it in, it is recognized in system preferences/Sound. But when I try to record in Garageband, the input level monitor barely responds. I've selected Mono 1, as advised by tech support at Samson. I can't figure out why this mic isn't working, since it's recommended for this exact purpose. I don't think I should need a pre amp, but now am wondering if I do. I'd be so grateful for any advice.
    KM

    You know, the Samson C01U worked ok with the SoftPre software from Samsontech under Tiger. The input levels could be bumped up sufficiently for the **** thing to actually work even if the input signal was still on the low side.
    Since SoftPre doesn't work with Leopard, the C01U is basically an expensive paperweight.
    Since Samsontech seem incapable of getting their act together in putting out an operational Leopard compatible SoftPre software release, my advice to the wider Mac community is to BOYCOTT THEM! If Samsontech aren't receptive to consumer needs and flexible enough to act accordingly when the OS is updated, they deserve to suffer the commercial consequences - period!
    I'm looking forward to the liberating ritual of taking a giant sledgehammer and obliterating my now useless, overpriced Samson C01U. It's going to be awesome !

  • Sysgen : How to read the input port data type, width and rate dynamically in a masked subsystem ?

    Hello everybody,
    I am designing a general-purpose block in System Generator. I pass user parameters to the block by masking it. Some user parameters can change the block configuration. The input port data type, width and rate can also affect the block configuration.
    The problem is that these values (input port data type, width and rate) are subject to change, so I should read them dynamically and then change the block configuration by programming the "Initialization Commands" field. But unfortunately there is no straightforward way to read the input port information.
    There are some methods available in, for example, the "Black Box" block. These are:
    input_width = this_block.port('din').width;  % bit width of port 'din'
    input_rate = this_block.port('din').rate;    % sample rate of port 'din'
    But these methods are not applicable to a masked subsystem.
    I have tried other ways also. You can find them below. None of them worked.
    Does anybody know how I can solve this problem?
    Other ways I tried:
    1)
    % Put the model into a compiled state so compiled port attributes
    % become readable, query them, then release the compiled state.
    design_name([],[],[],'compile')
    q = get_param(gcb,'PortHandles');
    get_param(q.Inport,'CompiledPortDataType')
    get_param(q.Inport,'CompiledPortWidth')
    get_param(q.Inport,'CompiledPortDimensions')
    design_name([],[],[],'term')
    2)
    % S-function C API call; only available inside an S-function,
    % not in mask initialization code.
    ssGetInputPortDataType
    3)
    % Query the compiled sample times of the subsystem's input block.
    ts = Simulink.Block.getSampleTimes([gcb '/Input'])
     

    Today we rely on Simulink to perform parameterization of your designs in two ways:
    1. Parameterizable subsystems and blocks: parameters themselves can be MATLAB expressions that need to be evaluated, for which we need the MATLAB interpreter.
    2. The very useful rate and type propagation of Simulink compilation, which allows us to specify types and rates in one location and have them systematically propagated to all blocks.
    To truly make the HDL netlist that is generated from SysGen parameterizable, we would have to implement some of this capability in the HDL netlist itself by:
    1. Using generics (VHDL) or parameters (Verilog): we would have to capture the bit-width (type) propagation through levels of hierarchy and finally parameterize the IP itself based on this value. Since the IP itself does not have this capability through generics, we would have to package a separate Tcl script that updates the IP parameterization appropriately in response to top-level (or GUI) parameters.
    2. Interpreting MATLAB expressions and translating them into VHDL/Verilog expressions (or, alternatively, Tcl expressions for the IP). In Simulink, mask parameters can be passed from one level to the next, and the parameterization of a block can be composed of MATLAB expressions using variables from ancestor masks and the MATLAB interpreter - so we would need to somehow capture that as well.
     

  • Ideas for Providing User Level Data Backup and Restore

    I'm looking for ideas for implementing a user level application data backup and restore in an Apex app.
    What would be great is to have a user be provided an export file and a way to import this file. A bit overkill but hopefully never needed.
    Another option that is perfectly doable is a report that simply provides a means to create an export of the data. Since I already have an import interface, an export produced this way could be re-loaded through it.
    Any thoughts?
    Hopefully I'm missing something already there for an end user to use.

    jlincoln wrote:
    "Do you mean "export" and "import" colloquially, or in the specific sense of the exp/imp/datapump utilities?"': I mean as in imp/exp Oracle utilities. Generally speaking, it would be neat to be able to export and import via an Apex an application. In this hosted environment I don't have that access but would this be a bad idea if you don't care about the existing data in the schema in which the data resides?I can envisage a mechanism using <tt>exp/imp</tt>, but since it requires <tt>dbms_scheduler</tt> external jobs and access to the file system it's highly unlikely to be possible in a hosted environment. (Unless you're doing the hosting?)
    Backup: Necessary for piece of mind and flexibility. I am working on a VB/Access user who does this today to get to the point when they can be comfortable with the backups occurring regularly and by the hosting site's DBA group.
    Restore: Like I said. I am working on a VB/Access user who does this today to get to the point when they can be comfortable with the backups occurring regularly and by the hosting site's DBA group. This is a very small data set. A restore would simply remove existing data and replace it with the new data.My opinion is that time would be better spent working on the user rather than a redundant backup and restore feature. Involve them in a disaster recovery exercise with whoever is hosting the environment to prove that their data is safe. Normally the inclusion of data in regular, effective database backups is sold as a major feature of APEX solutions.
    "What about security/privacy when this data ends up in uncontrolled environments?": I don't understand the point of this question. The data should not end up in uncontrolled environments. Just like the data in the database or its backups.Again, having data in a central, shared location protected by multiple levels of application, database, and OS security is usually seen as a plus for APEX over VB/Access. Exporting the data in toto to a PC/laptop that can be stolen or lost, and where it can be copied to USB drives/phones/email loses this protection.
    User Level: Because the end user must have access to the backup and restore mechanisms of the application.
    Application Data: The application data. Less than 10MB. Very small. It can be exported in a flat file downloaded by the end user. This file can then be used to upload and import via an existing application interface. For example.
    "I'm struggling to parse this for meaning.": When I say I have an existing interface I am referring to a program residing in the Apex application that will take data from a flat table structure (i.e. interface table), validate the data, derive data, and load into the target table structure.Other than the report export capability linked to above, there's nothing built-in to APEX that comes close to your requirement. If the data is simple enough that it can be handled in such a report, and you have a process that can read and recreate this export, then you have your backup/restore capability. If the data can't be handled in a simple report, then you'll need a more complex PL/SQL process to generate the file.

  • Input level - meter?

    This might seem like a stupid question, but I need to ask it as the GB help doesn't explain it clearly. Where should one set the input level for recording, and where does one MONITOR that level via a meter reading (to check for clipping, etc.)? This seems like such a basic need, so I'm confused as to why it's not clearer.
    I'm using an M-Audio FW410 which has its own software mixer, but from their manual it seems the input level should be set from within GB.
    I've experimented with the Recording Level slider in GB and even when it's not greyed out I'm not hearing a difference in signal, even if I turn it to 0. Do I actually have to go to the Sound control panel in System Preferences for this? I should mention I'm recording a "real instrument" track and my signal is coming through loud and clear.
    The only two level meters I see from within GB are for each track's volume (which is just for mixing and/or monitoring while recording, correct?) and the master volume slider/meter at the bottom.
    thanks,
    Scott

    The meter on each input should be reading at a sensible level: not over-peaking but not too low either. The main control has no effect on these levels, only on the final mix. For example, if the main control was at a normal setting and you had a number of channels all at a high level, they would add together and you would need to bring the final level down to prevent it over-peaking. The level should be set sensibly at every stage.
    You set the input in the GarageBand Preferences, so this will show what you are using. If you have your interface selected then the internal mic is not in use.

  • Input Query Data disappears

    Hi,
   I have created an aggregation level on a MultiProvider and restricted the query as follows:
       Currency Key (USD)
       Posting Period (#)
       InfoProvider
       Actual Amt(RKF by Actual Infoprovider) - Read Enabled
       Plan Amt(RKF by Plan Infoprovider) - Plan Enabled
       Input Query starts in change mode
    When I execute the query in BEx Analyzer, I can see data in the Actual Amt RKF and can enter data in the Plan Amt RKF. When I save the data, it disappears. For saving I am using the SAVE_AREA command. Any ideas on what is missing here?
    Before I make my selections more complex, I wanted to try this simple one. Also, I can't execute this query from Query Designer (some problem with a Basis setting); I hope that isn't the reason. I got this working when I created the input query on the real-time provider rather than the MultiProvider, but I want to make it work with the MultiProvider as I need to show data from both actuals and plan.
    Thanks
    RT

    Ravi,
    You tell the system within your RKF by selecting the real-time InfoProvider in addition to the amount. I deleted what I had created earlier to have a clean start. Here are the steps I followed to re-create it (we are working with 0AMOUNT, posting period, currency key and InfoProvider objects only):
    1. Created 2 restricted key figures (RKF1 - 0AMOUNT, actual cube; RKF2 - 0AMOUNT, plan cube).
    2. Set RKF1 planning properties to "data cannot be changed" and RKF2 planning properties to "data can be changed by user or planning functions".
    3. Created a structure in columns and added these two RKFs.
    4. Added Posting Period to rows and changed display to key. Set access type to master data and query execution to characteristic relationships.
    5. Added Currency to rows and changed display to No Display.
    6. In the default values, set posting period to 1 and currency key to USD.
    7. Added InfoProvider to free characteristics.
    8. Added both posting period (1) and currency key (USD) to characteristic restrictions.
    9. Opened BEx Analyzer and created the save command.
    10. Executed the query, input the data, and the data disappears.
    Cheers
    RT

  • Export section of data from ASO

    Is there a way to export only a certain portion of data from an ASO cube?
    There is an export function in the console, but it exports all level 0 data. For example, I'd like to export just all the January 2007 level 0 data.
    Thanks!

    I was trying to export a section of data so that I could change it to #Missing and re-upload to "erase" those intersections. I wrote report scripts but the data sets were too large and were taking hours to render.
    After much research, I came up with a simple solution. The data that I needed to change to #Missing was always time-related. For example, I needed to delete all the data for Week 32 in 2008. To accomplish this I simply deleted Week 32 out of the time dimension, then re-added it after the restructure. Data cleared!

  • Export photo creation date

    How does one preserve the original creation date on photos exported from iPhoto? My wife and I use different cameras and want to integrate our pictures. The only way to do so is by creation date. But when I export my photos from iPhoto, the creation date of the photo is changed to the date I export the photo, not the date I took it. This should not be. Is there a way to fix it?

    There are two kinds of metadata involved when you consider a jpeg or other image file.
    One is the file data. This is what the Finder shows. This tells you nothing about the contents of the file, just the File itself.
    The problem with File metadata is that it can easily change as the file is moved from place to place or exported, e-mailed, uploaded etc.
    Photographs also have both Exif and IPTC metadata. The date and time that your camera snapped the photograph is recorded in the Exif metadata. Regardless of what the file date says, this is the actual time recorded by the camera.
    Photo applications like iPhoto, Aperture, Lightroom, Picasa, Photoshop etc get their date and time from the Exif metadata.
    When you export from iPhoto to the Finder, a new file is created containing your photo (and its Exif). The file date is - quite accurately - reported as the date of export.
    However, the Photo Date doesn't change.
    The problem is that the Finder doesn't work with Exif.
    So, your photo has the correct date, and so does the file, but they are different things. To sort on the Photo date you'll need to use a photo app.
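    If you do want the exported files' dates to match the photo dates anyway, the Exif capture time can be copied onto the file. A small Python sketch, assuming the third-party Pillow package; 36867 is the standard Exif tag number for DateTimeOriginal, and the filename is illustrative:

    # Reset a file's timestamps to the Exif capture time, so Finder
    # sorting by file date matches the photo date.
    import os
    import time
    from PIL import Image

    def set_file_date_from_exif(path):
        exif = Image.open(path)._getexif() or {}
        taken = exif.get(36867)  # 36867 = DateTimeOriginal
        if taken:
            t = time.mktime(time.strptime(taken, "%Y:%m:%d %H:%M:%S"))
            os.utime(path, (t, t))  # set access and modification times

    set_file_date_from_exif("IMG_0001.jpg")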
