BEA Custom Data Transfer and non-BEA Producer Portals?

Hello,
I recently had a problem where my BEA 10 Consumer was not sending my BEA 10 Producer the user's X.509 digital certificate used in authentication. In a non-WSRP environment, the certs come over in the HttpServletRequest object. I was told that for WSRP, BEA strips things out of the request object, and the cert was one of those things. The solution was to implement BEA's Custom Data Transfer. I was able to solve our problem using this approach. All is well.
Now I have run into a problem where one of our other projects is using a non-BEA portal and also wants to be a Producer. Since Custom Data Transfer is a BEA solution (the Producer portlet class imports BEA classes), how can a non-BEA Producer receive the user's digital certificate, since they presumably can't implement BEA's Custom Data Transfer (unless, I guess, they bring in some BEA library file with those particular classes)?
Any insight? Thanks - Peter Len

Peter,
What producer are you using? Some other vendors have implemented BEA's
custom data transfer. Also, the WSRP committee is looking into
standardizing this.
Additionally, you may use an interceptor (9.2 and later) to set HTTP
headers that the producer may be able to use.
Nate

Similar Messages

  • Custom Data Transfer in WSRP Interceptor Framework

    Hi,
    I'm trying to pass custom data (XML data) between the consumer and producer (WLP 10.3) using the Interceptor Framework.
    This is what I'm trying to do:
    public Status.PreInvoke preInvoke(IGetMarkupRequestContext requestContext) {
        Element root = (Element) document.createElement("Test1");
        document.appendChild(root);
        root.appendChild(document.createTextNode("111"));
        XmlPayload xp = new XmlPayload(root);
        requestContext.setMarkupRequestState(xp);
        // ...
    }
    What if I want to send a second XML element, "Test2"? Any pointers would be helpful.
    Thanks.

    Hello,
    The WSRP custom data transfer implementation allows only a single XML root element to be sent as the custom data, but that root element can contain as many child elements as you want. So you could do something like this:
    Element root = document.createElement("CustomData");
    // Add the first piece of custom data
    Element test1 = document.createElement("Test1");
    root.appendChild(test1);
    test1.appendChild(document.createTextNode("111"));
    // Add the second piece of custom data
    Element test2 = document.createElement("Test2");
    root.appendChild(test2);
    test2.appendChild(document.createTextNode("222"));
    XmlPayload xp = new XmlPayload(root);
    requestContext.setMarkupRequestState(xp);
    Kevin
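The snippets above omit how `document` is created. As a self-contained illustration, here is a sketch of assembling the same single-root payload using only standard JDK DOM/JAXP classes; `XmlPayload` and `requestContext` are WebLogic Portal-specific, so they are left out here and the assembled payload is simply serialized and printed:

```java
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class CustomDataExample {

    // Builds the single-root custom data document described above.
    static String buildCustomData() {
        try {
            Document document = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();

            Element root = document.createElement("CustomData");
            document.appendChild(root);

            // First piece of custom data
            Element test1 = document.createElement("Test1");
            test1.appendChild(document.createTextNode("111"));
            root.appendChild(test1);

            // Second piece of custom data
            Element test2 = document.createElement("Test2");
            test2.appendChild(document.createTextNode("222"));
            root.appendChild(test2);

            // Serialize to a string just to show the resulting payload;
            // in the interceptor you would instead wrap `root` in XmlPayload.
            StringWriter out = new StringWriter();
            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
            t.transform(new DOMSource(document), new StreamResult(out));
            return out.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(buildCustomData());
        // Prints: <CustomData><Test1>111</Test1><Test2>222</Test2></CustomData>
    }
}
```

The DOM construction is plain JDK code, so this part compiles and runs without any BEA libraries.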

  • Re: difference between data transfer and data upload

    Hi,
    I want to know the difference between data transfer and data upload.
    Regards
    Sasi

    Hi,
    Data Transfer: data transferred from the SAP system to a non-SAP system.
    Data Upload: data transferred from a non-SAP system to the SAP system.
    Regards,
    kavitha....

  • MyRIO memory, data transfer and clock rate

    Hi
    I am trying to do some computations on a previously obtained file sampled at 100Msps using myRIO module. I have some doubts regarding the same. There are mainly two doubts, one regarding data transfer and other regarding clock rate. 
    1. Currently, I access my file (size 50 MB) from my development computer's hard drive in the FPGA through a DMA FIFO, taking one block of around 5500 points at a time. I have been running the VI in emulation mode for the time being. I was able to transfer through DMA from the host, but it is very slow (I can see each point being transferred!). The timer connected in the while loop in the FPGA says 2 ticks for each loop, but the data transfer is taking long. There could be two reasons for this: one, that the serial cable is the problem and the DMA happens fast but the update as seen by the user is slower; the other, that the timer is not recording the time for the data transfer. Which one could be the reason?
    If I put the file in the myRIO module, I will have to compile it each and every time, but does it behave the same way as before with the dev PC (will the DMA transfer be faster)? And here too, do I need to put the file on the USB stick? MAX says that there is 293 MB of primary disk free space in the module, but I am not able to see this space at all. If I put my file in this memory, will the data transfer be faster? That is, can I use any static memory on the board (>50MB) for my file? Or can I use any data transfer method other than FIFO? This forum thread (http://forums.ni.com/t5/Academic-Hardware-Products-ELVIS/myRIO-Compile-Error/td-p/2709721/highlight/... discusses this issue, but I would like to know the speed of the transfer too.
    2. The data in the file is sampled at 100Msps. The filter blocks inside the FPGA ask me to specify the FPGA clock rate and sampling rate. I created a 200MHz derived clock and specified it, gave the sampling rate as 100Msps, but the filter is giving zero results. Do these blocks work with derived clock rates, or is that a property of the SCTL alone?
    Thanks a lot
    Arya

    Hi Sam
    Thanks for the quick reply. I will keep the terminology in mind. I am trying to analyse the data file (each set of 5500 samples corresponds to a single frame of data) by running some intensive signal-processing algorithms on each frame, then averaging the results and displaying them.
    I tried putting the file on the RT target, both on a USB stick and in the RT target's internal memory. I decided to write the delay time for each loop, after the transfer has occurred completely, to a text file on the system. I ran the code by making an exe for both the USB stick and RT internal memory methods, and by compiling using the FPGA emulator in the dev PC VI. (A screenshot of the last method is attached; the same is used for the other methods with minor modifications.) To my surprise, all three of them gave 13 ms as the delay. I certainly expected the transfer from RT internal memory to be faster than USB, and the one from the dev PC to be the slowest. I will work more on this and try to figure out why this is happening.
    When I transferred the data file (50MB) into the RT flash memory, MAX showed a 50MB decrease in free physical memory but only a 20MB decrease in primary disk free space. Why is this so? Could you please tell me the difference between them? I did not find any useful online resources when I searched.
    Meanwhile, the other doubt still persists: is it possible to run filter blocks with derived clock rates? Can we specify clock rates like 200MHz and sampling rates like 100Msps in the filter configuration window? I tried, but obtained zero results.
    Thanks and regards
    Arya
    Attachments:
    Dev PC VI.PNG ‏33 KB
    FPGA VI.PNG ‏16 KB
    Delay text file.PNG ‏4 KB

  • Azure Site Recovery to Azure - cost for data transfer and storage

    Hello,
    I send you this message on behalf of a small firm in Greece interested to implement Azure Site Recovery to Azure.
    We have one VM (Windows 2008 R2 Small Business Server) with 2 VHDs (100GB VHD for OS and 550GB VHD for Data) on a Windows 2012 server Std Edition.
    I would like to ask you a few questions about the cost of the data transfer and the storage 
    First: About the initial replication of the VHDs to Azure: it will be 650GB. Is it free as inbound traffic? If not, the Azure pricing calculator shows about 57€. But there is also the import/export option, which costs about the same:
    https://azure.microsoft.com/en-us/pricing/details/storage-import-export/
    What would be the best solution for our case? Please advise.
    Second: What kind of storage is required for the VHDs of the VM (650GB)? My guess is blob storage. For locally redundant storage, the cost will be about 12-13€/month. Please verify.
    Third: Is the bandwidth for the replication of our VM to Azure free?
    That's all for now.
    Thank you in advance.
    Kind regards
    Harry Arsenidis 

    Hi Harry,
    1st question response: ASR doesn't support Storage Import/Export for seeding the initial replication storage. The ASR pricing details here show that about 100GB of Azure replication & storage per VM is included with the purchase of the ASR to Azure subscription SKU through the Microsoft Enterprise Agreement.
    The data transfer pricing here indicates that inbound data transfers are free.
    As of now the only option will be online replication. What is the current network link type & bandwidth to Azure? Can you vote for the feature & update the requirements here?
    2nd question response: A storage account with geo-redundancy is required. But, as mentioned earlier, with the Microsoft Enterprise Agreement you will get 100GB of Azure replication & storage per VM included with ASR.
    3rd question response: Covered as part of the earlier responses.
    Regards, Anoob

  • How to create data transfer and copying requirements

    Hi Gurus,
    Can you tell me how to create data transfer and copying requirements in copying control for SD?
    Thanks,
    Paul

    Hi Paul,
    Go to Transaction code - VOFM.
    In the menu, select Data Transfer and choose where you want to maintain the copying requirements, i.e. at the order level, delivery level, etc.
    Here, you can either create a new routine or change the existing routine according to your requirement.
    Make sure you activate the routine (click on Edit - Activate from menu) once you are done with the routine.
    Regards
    Sai

  • Why can't I find "Data Transfer and Memory " option in channel property node?

    Hi All,
    I'm trying to find properties such as "Data Transfer and Memory" in my "DAQmx Channel" property node, as shown below.
    But in my LabVIEW 8.5 I can't find it. I re-installed NI-DAQmx, but it didn't work. What I can get is shown in the attached screenshot.
    Did I miss something? Thank you.
    Regards,
    Bo
    My blog Let's LabVIEW.
    Attachments:
    temp.JPG ‏11 KB

    Hi Andrey,
    Thank you for your reply. I tried this on another computer with LabVIEW 8.6 installed, and it worked: I simply dropped the channel property node, without connecting it to any channel VI, and the options were available there. But on this PC it doesn't work. I also configured the hardware in Measurement & Automation Explorer; it didn't help.
    BTW, my LabVIEW is fully licensed.
    Best wishes,
    Bo
    My blog Let's LabVIEW.

  • Diff between init with data transfer and repair full request

    hi,
    I have observed that even in the new flow we do an init without data transfer and then a repair full request.
    If I do an init with data transfer, can I achieve the same result?
    I want to know why we need to do this. Do we get any advantage from doing an init without data transfer plus a repair full request?
    Please suggest.

    Hi Venkat,
    A repair full request is loaded in cases where you get erroneous records or where there are missing records that you want to load. In such cases, a repair full request for the selective records can be loaded and is the most efficient way to correct the data, since you won't be required to load the entire data once again. You also achieve the desired result without disturbing the delta. (You are not required to do an init without data transfer after a repair full; people just do it as a precaution.)
    However, repair full requests should only be loaded to InfoProviders with update mode 'overwrite'. Otherwise, if the InfoProvider has update mode 'additive', it is very probable that you will double the values of the key figures due to rows being added twice. So, if your InfoProvider is additive, you will need to delete the entire data and do an 'init with data transfer' to avoid corrupting the data. (Note: you can do a repair full request for an additive InfoProvider in the case of lost records, or if you can delete the erroneous records with selective deletion. But you have to be careful with the selections lest you inadvertently load other records than required and corrupt the data.)

  • Solution for Data transfer from Non-network location

    If I needed 10 non-networked locations around Australia to submit data (say text files ranging from 20 to 100 MB each) each day to my server, and the data loaded and manipulated into a database in a secure environment, would you be able to describe the key points and a possible solution to this problem?

    Hi Amar,
    FTP (File Transfer Protocol) alone isn't a viable option to give the insight, security, performance, and, ultimately, the risk mitigation necessary to responsibly conduct business. One solution for protecting and transferring sensitive or mission-critical data securely is Managed File Transfer (MFT). Managed File Transfer solutions provide a greater level of security, meet strict regulatory compliance standards, and give you the reliability you need in a data transfer solution.
    Points to look for in a solution:
    a. It should provide file transfer transparency throughout your entire organization.
    b. It should support the most modern security standards and methodologies, including SSL encryption, X.509 certificates and proxy certificates.
    c. It should streamline the audit process while making that audit information accessible from a central point, saving you time and money.
    d. It should include functionality that allows data to be pre- and post-processed.
    e. It should ensure that all interrupted file transfers resume where they left off after a connection failure, without manual intervention.
    f. It should tightly integrate with your existing job-scheduling solution to issue alerts if connections are not re-established within an acceptable time interval.
    g. It should adhere to current security and audit requirements, including SOX, GLBA and HIPAA.
    Hope this helps.
    Regards,
    Jimmy

  • Init with out data transfer and DELTAS

    Dear Gurus,
    I created a generic DataSource using a function module on the FAGLFLEXT table, using a timestamp for capturing deltas. On the first run, while loading into the PSA, I selected "init with data transfer" and it failed. I then deleted the init request in the scheduler and ran "init without data transfer", which succeeded. When I next ran a delta, it failed again. I then deleted the init request and ran a full load, which succeeded. Why are my deltas failing? Even on the R/3 side, when I check RSA7, the Current Status tab shows the timestamp field like this: . . : :.
    I am stuck and don't know what to do.
    Can one kindly give solution for this.
    Thanks,
    Sony

    Debug your generic DataSource, using RSA3, to determine why the delta failures are occurring. It's most likely in the delta logic portion of your custom Function Module.

  • Is it mandatory to use Business object for data transfer and work flow?

    In our enterprise projects we deal with screens where we add or update some data, and then retrieve that data and display it to the user in various ways, such as an on-screen report or an Excel export. Although we identify the business objects based on the nouns in the requirements document, in practice there is often no business logic in those objects. Almost all the objects we identify as business objects have only add, delete, update and retrieve methods. In these cases, is it still necessary to have business objects, or can we use transfer objects to send the data for saving and retrieving? Is there any other pattern/guideline for dealing with such cases?
    I really appreciate your comments on this topic. I also apologize ahead of time if my explanation is not clear.
    Thanks in advance.
    Regards,
    Rizwan.

    In my opinion, the DAO pattern would be suitable for you. Generally the DAO (Data Access Object) has the CRUD (create, read, update, delete) methods for retrieving data from a data source (e.g. a database). Based on that data it creates the DTOs (Data Transfer Objects) and passes them to the caller. It also receives DTOs from the caller and saves them to the data source. Thus in your case you can
    - remove the CRUD methods from the Business Objects (BOs) and make them pure DTOs used with DAOs (if you are using JDBC code inside your application and you don't have much validation or processing logic in your BOs):
    http://www.corej2eepatterns.com/Patterns2ndEd/DataAccessObject.htm
    OR
    - use BOs in combination with DAOs (if using a database connection from a pool and the BOs have complicated processing logic).
    http://www.corej2eepatterns.com/Patterns2ndEd/BusinessObject.htm
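The DAO/DTO split suggested above can be sketched in a few lines of plain Java. This is only an illustrative arrangement, not any particular framework's API; `CustomerDTO`, `CustomerDao` and `InMemoryCustomerDao` are hypothetical names, and the in-memory map stands in for a real JDBC-backed data source:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// DTO: a plain data carrier with no business logic.
class CustomerDTO {
    final int id;
    final String name;
    CustomerDTO(int id, String name) { this.id = id; this.name = name; }
}

// DAO: owns all CRUD access to the data source; callers only see DTOs.
interface CustomerDao {
    void create(CustomerDTO c);
    Optional<CustomerDTO> read(int id);
    void update(CustomerDTO c);
    void delete(int id);
}

// In-memory stand-in for a JDBC-backed implementation.
class InMemoryCustomerDao implements CustomerDao {
    private final Map<Integer, CustomerDTO> table = new HashMap<>();
    public void create(CustomerDTO c) { table.put(c.id, c); }
    public Optional<CustomerDTO> read(int id) { return Optional.ofNullable(table.get(id)); }
    public void update(CustomerDTO c) { table.put(c.id, c); }
    public void delete(int id) { table.remove(id); }
}

public class DaoDemo {
    public static void main(String[] args) {
        CustomerDao dao = new InMemoryCustomerDao();
        dao.create(new CustomerDTO(1, "Rizwan"));
        System.out.println(dao.read(1).get().name); // prints: Rizwan
    }
}
```

A JDBC implementation would keep the same `CustomerDao` interface, so the calling code never changes when the data source does.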

  • Source for Init with data transfer and full load

    Hi Experts,
    1. Please tell me where the data comes from when we do the
        following actions:
        a) Init with data transfer
        b) Full load
    I want to know the source of the data, i.e. the setup table, the main table, or any other table.
    It would be helpful if you could provide the data flow.
    Kindly tell me which is preferable and why.
    Regards,
    RG

    Hi,
    When you do an init with data transfer, it reads data from the setup table only, but it also sets the init flag. Because of this, from then on any new records go to the delta queue, and the next time you run a delta load it picks up the records from the delta queue.
    If the data is already available on the BW side, you can run an init without data transfer; this only sets the init flag and does not pick up any records.
    The extraction structure gets its data from the setup table in the case of a full upload, and from the delta queue in the case of a delta load.
    Regards,
    Debjani

  • How to deal with Portal, Workshop, and non-BEA software?

    Hi,
    I'm really new to working with portal, and have been looking into integrating a single sign-on product (RSA ClearTrust) into existing portal code.
    The ClearTrust installation involves installing a bunch of software that integrates with the portal software, but I found that if the portal code was developed with Workshop, and Workshop is run again after the ClearTrust integration has been done, Workshop appears to overwrite some of the XML configuration files (e.g., web.xml).
    The result is that the WebLogic server won't start.
    So I was wondering if anyone here has had to deal with this kind of situation, not necessarily with ClearTrust specifically, but maybe with other products?
    I'm not looking forward to re-doing the integration every time someone makes some portal code changes :(...
    Thanks in advance,
    Jim

    There are really two types of Graphic Styles - Object level styles and Group level styles. There is no way to tell which style is which. So if you have an Object level Style and try to apply it to a group of objects or often a compound shape, you get each individual object having the style. If you have a Group level style and try to apply it to only an object, you often get nothing applied.
    It perhaps sounds like you have an Object level style and are trying to apply it to a group.
    There's no way to convert an Object-level style to a Group-level style or vice versa. The best (and pretty much only) way I've found to get around this is to apply the style at the correct level, then make a note of all the style settings, select the other level, and recreate the style for that level.
    Whether or not a style applies correctly has a great deal to do with specific aspects of the style in addition to what level the style was generated from. This is a frustrating aspect of Graphic Styles.

  • Delivery document data transfer to non sap server.

    Hi,
    My requirement is: when I create a delivery label (VL01), the delivery document should be sent to an FTP server in flat-file format instead of to a printer.
    Normally for delivery document printing we use transaction VL71, but in my requirement I have to create a new program and configure it in transaction NACU for a particular output type.
    How do we approach this scenario (downloading the data to a flat file and transferring it to the server)? Please help.
    Thanks in advance.

    Hi,
    I think you are mixing different concepts. UD Connect, DB Connect and Web Services are methods for uploading data from non-SAP source systems to BW: UD Connect loads data from SQL tables, DB Connect from Oracle tables.
    If you want to transfer data to other systems, Open Hub is the easiest way. You can put the data into a flat table through the Open Hub, and you can fill the table with the delta method.
    Open Hub allows you to load data from InfoCubes, DSOs, or InfoObjects.
    I hope it helps.

  • Linked Server Problem while SQL Data Transfer using Non-sys Admin Account

    Hi Team,
    I have an issue while transferring data from ServerA to ServerB. I've made ServerB a linked server of ServerA. Prerequisites:
    1) A SQL account has been created on ServerB.
    2) The timeout setting for remote connections is set to "No Timeout".
    When I execute the below script in a query window, it executes successfully:
    Insert into ServerB.DBName.dbo.TableName1
    Select * from dbo.TableName1
    But when I execute the same statement from a SQL Agent job, it fails with the below error message:
    Message
    Executed as user: DomainName\UserName. Named Pipes Provider: Could not open a connection to SQL Server [1450]. [SQLSTATE 42000] (Error 1450)  OLE DB provider "SQLNCLI" for linked server "ServerB" returned message "Login
    timeout expired". [SQLSTATE 01000] (Error 7412)  OLE DB provider "SQLNCLI" for linked server "ServerB" returned message "An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005,
    this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.". [SQLSTATE 01000] (Error 7412).  The step failed.
    Could you please help me fix the above error so that I can transfer the SQL data between the 2 servers?
    Thank You
    Sridhar D Naik

    Sridhar,
    Is this still an issue?
    Thanks!
    Ed Price, Azure & Power BI Customer Program Manager
