Read file and post to MQ

Hi,
I have a scenario:
* Read a text file from a directory.
* Post the content to MQ.
I don't have any schema for the content of the text file.
So I used the opaque type in my FileAdapter while reading the text file and posted the data to MQ using the MQAdapter.
But since the content is opaque, the data in MQ is not understandable.
So I want to understand how I can send the text data to MQ so that it can be understood.

Hi All,
I found the solution for the error I mentioned.
I had to import the classes below to make it work.
<bpelx:exec import="java.util.*"/>
<bpelx:exec import="java.lang.*"/>
<bpelx:exec import="java.math.*"/>
<bpelx:exec import="java.io.*"/>
<bpelx:exec import="oracle.soa.common.util.Base64Decoder"/>
After importing them, my compilation problem was solved.
Now I have a new problem :)
I have written the code below in a Java Embedding activity for decoding:
try {
    String messagePayload = (String) getVariableData("inputMessage");
    // decode the message
    byte[] decodedBytes = Base64Decoder.decode(messagePayload.getBytes());
    String decoded = new String(decodedBytes);
    setVariableData("tempBase64EncodedData", decoded);
} catch (Exception e) {
    addAuditTrailEntry(e.toString());
}
But after this, if I try to copy the message from "tempBase64EncodedData" to any other variable, it comes through empty.
So, my question is:
* Am I doing the correct thing by assigning the decoded message to the variable "tempBase64EncodedData"?
or
* Does the decoding itself have some issue?
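To narrow it down, here is a minimal standalone sketch of the decode step using the JDK's java.util.Base64 rather than oracle.soa.common.util.Base64Decoder (the sample payload is made up); if this prints the expected text for the payload you see in the audit trail, the decoding is fine and the problem is more likely in how the variable is assigned or copied afterwards:
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class DecodeCheck {
    public static void main(String[] args) {
        // Hypothetical sample payload; substitute the value of "inputMessage" from your audit trail.
        String messagePayload = "SGVsbG8gTVEh"; // Base64 for "Hello MQ!"
        // Decode the same way the Java Embedding activity does, just with the JDK decoder.
        byte[] decodedBytes = Base64.getDecoder().decode(messagePayload);
        String decoded = new String(decodedBytes, StandardCharsets.UTF_8);
        // An empty result here would point at the input, not at setVariableData().
        System.out.println("Decoded length: " + decoded.length());
        System.out.println("Decoded text: " + decoded);
    }
}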

Similar Messages

  • Reading files and converting into xml structure

    Hi,
    In my application a client requests the folder structure information from a server through RMI. The server needs to read the files and folders on the local machine, convert them into some structure (I am thinking of using XML) and send it back. For example, I am planning to have the server send back something like:
    <directory name="parentdirectory">
    <file name="abc.jpg"/>
    <file name="def.bmp"/>
    <directory name="subdirectory">
    <file name="hij.jpg"/>
    <file name="klm.bmp"/>
    </directory>
    </directory>
    It is just the names of the files I am interested in, not the contents. Is this a good approach, sending back the data as a string containing an XML definition? Is there any better approach in terms of performance, memory, etc.? I am currently planning on using DOM for construction of this structure. Is there source code for reading and converting the folder structure into XML? Just for your information, the client gets this information and shows it as a tree structure in the GUI.
    Thanks!!!!

    "Is this a good approach, sending back the data as a string containing an XML definition?" It'll work.
    An alternative, more direct approach is to build an in-memory representation and send it as the argument/return value of an RMI call. You'd need to write classes MyDirectory and MyFile; MyFile has just a name; MyDirectory has a name, a collection of MyDirectory and one of MyFile. Make these classes implement Serializable and you can send them over RMI.
    The effort to write those trivial classes would be less than implementing the XML encoding/decoding, and in terms of runtime performance and memory it will also be hard to beat Java's serialization with anything XML-based. In this case I doubt performance/memory are relevant considerations, though.
    If for some reason I'd go for sending XML strings anyway, I wouldn't do the encoding/decoding myself; I'd use XStream to convert Java classes to/from XML, and would still end up writing the above two classes and be done.
    Sorry if you wanted a simple yes or no :-)
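    As a rough sketch of what those two classes could look like (only the class names come from the reply above; the field names and the scan helper are my own, hypothetical additions):
    import java.io.File;
    import java.io.Serializable;
    import java.util.ArrayList;
    import java.util.List;

    // Both classes implement Serializable so the whole tree can be returned from an RMI call.
    class MyFile implements Serializable {
        final String name;
        MyFile(String name) { this.name = name; }
    }

    class MyDirectory implements Serializable {
        final String name;
        final List<MyDirectory> directories = new ArrayList<>();
        final List<MyFile> files = new ArrayList<>();

        MyDirectory(String name) { this.name = name; }

        // Server side: recursively build the tree from a real directory (names only, no contents).
        static MyDirectory scan(File dir) {
            MyDirectory node = new MyDirectory(dir.getName());
            File[] children = dir.listFiles();
            if (children != null) {
                for (File child : children) {
                    if (child.isDirectory()) {
                        node.directories.add(scan(child));
                    } else {
                        node.files.add(new MyFile(child.getName()));
                    }
                }
            }
            return node;
        }
    }
    The remote method can then simply declare MyDirectory as its return type and RMI's serialization takes care of the rest.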

  • Read parked and posting document from one FM

    Hi all,
    Is there any FM available to read parked and posted documents from one function module?
    Please let me know if anything is there.
    Thanks

    Hi T,
    Have you searched in SE37 with the word "park"? I'm sure there are a few FMs which will do the same.
    regards
    Prabhu

  • I just started having problems importing files from a Nikon D810 into LR 5.7: it pops up a window saying it cannot read the files

    I just started having problems with importing files from a Nikon D810 into LR 5.7. It pops up a window saying it cannot read the files. I am working on a 27" iMac running Yosemite. After a few tries it finally was able to read the files and import them into LR. I never had this problem before; was there some kind of update that could have caused this?

  • Read of XML file and post to IDOC

    Hi
    I'm working on a WAS620 and need to read an XML file from a customer, extract the fields needed and post these via IDOC ORDERS01. My problem is HOW to read the XML file. Can anyone give me the steps involved/links to examples etc.? I have not processed XML files via ABAP before.
    The file is posted to a shared folder and the ABAP program I am about to develop will pick up this file.
    The file is NOT in IDOC/XML format but the customer's own format.
    Hope someone can help me asap.
    Thanks all in advance
    /Bo

    Hi,
    I would like to extend this question for WAS620 and reading a proprietary, customer-specific XML file/data that is sent via HTTP to SAP WAS.
    Q1: What is the best way to read this XML data sent over HTTP (as it is, without transformations) into ABAP?
    Q2: What is the appropriate handler to use in the ICF object?
    Thanks all in advance

  • Use SkyDrive to upload collected files and post screen shot/picture. (Updated: 1/16/2012)

    To help community members troubleshoot issues efficiently, sometimes we need to collect related files (such as Event logs, Network traces, Setup log files, Screenshots, etc.) to perform a specific analysis.
    We can simply use SkyDrive, which is free storage on Windows Live; it's easy to store and share your files and photos with almost anyone. To make the steps clear, I would like to share the detailed steps for:
    1. How to use SkyDrive to upload collected files?
    2. How to post screenshots or other pictures in forum threads?
    1. How to use SkyDrive to upload collected files?
    1) Open the SkyDrive site.
    2) On the Sign In page, if you own a Windows Live ID, type your Windows Live ID and Password to sign in; otherwise, you may click Sign Up to register a new one. See the following screen shot.
    3) After signing in to Windows Live, click Create folder.
    4) Name the folder and edit permissions to share it with the relevant people.
    Important Note: You can share the folder with Everyone if there is no private/sensitive information (such as the screen shot of a prompt error message). To do so, simply select the "Everyone (Public)" option in the "Share with" box. Please refer to the following screen shot:
    However, for private information that you do NOT want to be accessed by everyone, it is recommended to share with specified individuals. To do so, you can expand the "Share with" box, choose the "Select people..." option, and then type their Live IDs in the Individual box. Please refer to the screen shot below:
    5) Drag the previously collected files directly into the SkyDrive folder. For example:
    6) Click Upload.
    7) Click on the uploaded files and tell us the URL.
    2. How to post screenshots or other pictures in forum threads?
    (This section updated on 1/16/2012 to reflect changes in SkyDrive. Thanks to Jeeped on the Microsoft Answers forums.)
    1) Please use the above method to upload the picture files in advance. Then, in your SkyDrive space, right-click the picture and select View Original. Press Ctrl + A, or right-click and select Copy, to select it.
    2) Press Ctrl + C to copy the picture.
    3) Navigate to your post textbox and press Ctrl + V, or right-click and select Paste, to paste the picture there.
    Then, the picture will be successfully inserted in that thread.

    I can't even open Windows now! It crashes mercilessly. Here is a copy of the only dump file I have; I retrieved it from second life and can't copy it onto the clipboard.
    Problem signature:  Problem Event Name:BlueScreen  OS Version:6.1.7600.2.0.0.768.3  Locale ID:1033
    Additional information about the problem:  BCCode:124  BCP1:0000000000000000  BCP2:FFFFFA80050BE8F8  BCP3:0000000000000000  BCP4:0000000000000000  OS Version:6_1_7600  Service Pack:0_0  Product:768_1
    Files that help describe the problem:  C:\Windows\Minidump\073011-29484-01.dmp  C:\Users\madamediva\AppData\Local\Temp\WER-72805-0.sysdata.xml
    Read our privacy statement online:  http://go.microsoft.com/fwlink/?linkid=104288&clcid=0x0409
    If the online privacy statement is not available, please read our privacy statement offline:  C:\Windows\system32\en-US\erofflps.txt
    (The rest of the post was a raw list of the loaded driver modules from the minidump, with load addresses and timestamps; notable entries include ntoskrnl.exe, hal.dll, ACPI.sys, Ntfs.sys, tcpip.sys, kl1.sys, PxHlpa64.sys, Wdf01000.sys and other standard Windows 7 drivers.)

  • Best way to stream lots of data to file and post process it

    Hello,
    I am trying to do something that seems like it should be quite simple but am having some difficulty figuring out how to do it. I am running a test that has over 100 channels of mixed sensor data. The test will run for several days or longer at a time and I need to log/stream data at about 4Hz while the test is running. The data I need to log is a mixture of different data types that include a time stamp, several integer values (both 32 and 64 bit), and a lot of floating point values. I would like to write the data to file in a very compressed format because the test is scheduled to run for over a year (stopping every few days) and the data files can get quite large. I currently have a solution that simply bundles all the data into a cluster and then writes/streams the cluster to a binary file as the test runs. This approach works fine but involves some post processing to convert the data into a format, typically a text file, that can be worked with in programs like Excel or DIAdem. After the files are converted into a text file they are, no surprise, a lot larger (about 3 times) than the original binary file size.
    I am considering several options to improve my current process. The first option is writing the data directly to a TDMS file, which would allow me to quickly import the data into DIAdem (or Excel with a plugin) for processing/visualization. The challenge I am having (note, this is my first experience working with TDMS files and I have a lot to learn) is that I cannot find a simple way to write/stream all the different data types into one TDMS file and keep each scan of data (containing different data types) tied to one time stamp. Each time I write data to file, I would like the write to contain a time stamp in column 1, integer values in columns 2 through 5, and floating point values in the remaining columns (about 90 of them). Yes, I know there are no columns in binary files, but this is how I would like the data to appear when I import it into DIAdem or Excel.
    The other option I am considering is just writing a custom data plugin for DIAdem that would allow me to import the binary files that I am currently creating directly into DIAdem.  If someone could provide me with some suggestions as to what option would be the best I would appreciate it.  Or, if there is a better option that I have not mentioned feel free to recommend it.  Thanks in advance for your help.

    Hello,
    Here is a simple example, of course here I only create one value per iteration in the while loop for simplicity. You can also set properties of the file which can be useful, and set up different channels.
    Besides, you can use multiple groups to have more flexibility in data storage. You can think of channels as columns and groups as sheets in Excel, so you will see your data that way when you import the TDMS file into Excel.
    I hope it helps, of course there are much more advanced features with TDMS files, read the help docs!

  • Synchronous read file and status update

    Hi,
    A CSV file is to be loaded into a database table. ASP is used as the front end and ESB should be used for the backend processing.
    A csv file is placed in a fileshare by an asp page. After that a button is clicked on the asp page which inserts a record into a database table(A) with the file details. There is a flag field in the table which will be set to 'F' initially.
    Now there should be an esb service which will be polling for any new records inserted into the database table(A).
    Once it finds any new record, the ESB service should update the flag status from 'F' to 'RF' and read the records in the CSV file (I think a synchronous read should be used in the file adapter) and insert them into a database table(B). The flag field is updated to 'RF' so that the ASP page informs the user that the CSV file is being inserted into table(B).
    After all the records are inserted, the esb service should once again update the flag in table(A) to 'Y' from 'RF' so that the asp page will indicate the user that the csv file is loaded into the data base table.
    The challenges that I have here are:
    1) I need to have the ESB service with such a sequence of execution that updates the table(A) at each stage so that the asp page can display the status to the USER.
    2) The CSV file size can be huge. It can contain as many as 100 thousand records with over 50 fields in each record. I will probably have to debatch the file. Once we debatch, a separate instance will be created for each batch. We then need to ensure that the flag in table(A) is updated to 'Y' only when the last batch of records has been processed.
    I am using jdev10134 and oracle SOA suite ESB10134
    Could someone please advise on how to go ahead?
    Cheers,
    RV

  • How to read file and output base64 string?

    I'm having a hard time reading a binary file and converting it to Base64.
    Is there a parameter in ExtendScript like file.read('base64')?

    It is possible in an HTML (CEP) extension:
    var path = "/tmp/test";
    var result = window.cep.fs.readFile(path, cep.encoding.Base64);
    if (0 == result.err) {
         // success
         var base64Data = result.data;
         var data = cep.encoding.convertion.b64_to_utf8(base64Data);
    } else {
         ... // fail
    }
    You can also refer to the following samples for more examples:
    https://github.com/Adobe-CEP/Samples/tree/master/Flickr
    https://github.com/Adobe-CEP/Samples/tree/master/Collage

  • Read file and store the file into a string

    I want to read the file and copy it to a string.
    I wrote this code but don't know what to do next.
    Please help me, it's urgent.
    File file_to_text = new File("c:\\wtgapp.xml");
    String wtgapp_string = new String();
    try {
        BufferedInputStream stream = new BufferedInputStream(new FileInputStream(file_to_text));
    } catch (FileNotFoundException file_error) {
        JOptionPane.showMessageDialog(null, "FILE READ ERROR", "READ ERROR!", JOptionPane.INFORMATION_MESSAGE);
        System.exit(1);
    }

    BufferedReader reader = new BufferedReader(new FileReader(file_to_text));
    try {
        while (reader.readLine() != null) {
            wtgapp_string = wtgapp_string + reader.readLine();
        }
        System.out.println(wtgapp_string);
    }
    I did this, but reader.readLine() != null didn't work.
    What should I write to go to the end of the file?
    no it is 16.00pm in here i live in TURKIYE istanbul :)
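    For what it's worth, here is a minimal sketch of the usual pattern, keeping the file name and variable name from the snippet above: call readLine() exactly once per iteration and stop when it returns null.
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class FileToString {
        public static void main(String[] args) throws IOException {
            StringBuilder wtgapp_string = new StringBuilder();
            // try-with-resources closes the reader even if an exception is thrown
            try (BufferedReader reader = new BufferedReader(new FileReader("c:\\wtgapp.xml"))) {
                String line;
                // readLine() returns null at end of file; call it only once per iteration
                while ((line = reader.readLine()) != null) {
                    wtgapp_string.append(line).append('\n');
                }
            }
            System.out.println(wtgapp_string);
        }
    }
    Calling readLine() twice in the same iteration, as in the earlier loop, skips every other line and can append the literal text "null" once the second call reaches the end of the file.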

  • Reading file and dump data into database using BPEL process

    I have to read CSV files and insert the data into a database. To achieve this, I have created an asynchronous BPEL process: added a File Adapter and associated it with a Receive activity, then added a DB Adapter and associated it with an Invoke activity. In total there are two Receive activities in the process; when I tried to test it through EM, only the first Receive activity completed, and it is waiting on the second Receive activity. Please suggest how to proceed.
    Thanks, Manoj.

    Deepak, thanks for your reply. As per your suggestion I created a BPEL composite with the template "Define Service Later". I followed the steps below; please correct me if I am wrong or missing anything. Your help is highly appreciated.
    Step 1 - Created a File Adapter and the corresponding Receive activity (checkbox "create instance" is checked) with an input variable.
    Step 2 - Then, in composite.xml, dragged the web service under "Exposed Services" and linked the web service with the BPEL process.
    Step 3 - Opened the .bpel file and added the DB Adapter with the corresponding Invoke activity and created an input variable. The web service is created of type "Service" with an existing WSDL (the first option against WSDL URL). Also added an Assign activity between the Receive and Invoke activities.
    Deployed the composite to the server. When I tried to test it manually through EM, it prompts for input like "subElmArray Size"; I entered the value 1 with corresponding values for the two elements and clicked the Test Web Service button. The process completes in error. The error is:
    Error Message:
    Fault ID
    service:80020
    Fault Time
    Sep 20, 2013 11:09:49 AM
    Non Recoverable System Fault :
    Correlation definition not registered. The correlation set definition for operation Read, process default/FileUpload18!1.0*soa_3feb622a-f47e-4a53-8051-855f0bf93715/FileUpload18, is not registered with the server. The correlation set was not defined in the process. Redeploy the process to the container.

  • RFC to read file and continue

    Hello,
    I have the following scenario:
    I have a report that needs to work with data saved in a CSV file in another server connected with XI.
    In my report I have to call a function to read the CSV file, wait, and continue when XI gives me the response.
    Is that scenario possible? How can I build it? My RFC function has just an empty tables parameter that waits to be filled by XI. Now the problem is XI. How can I read the file synchronously and fill the RFC response?
    Edited by: Marshal Oliveras on Apr 29, 2008 11:48 AM

    Hi,
    no, this is not possible. At least, not without further development. You can't use the File adapter for this (it does not support synchronous processing). What would be possible is to develop a web service which is called, reads the file, and returns it to the function. So the whole process would look like this:
    You call some function in R/3, this sends a request to PI, PI transforms the request and sends it to the web service. The web service reads the file and returns it to PI, and PI sends this to the calling function.
    Peter
    Edited by: Peter Jarunek on Apr 29, 2008 11:51 AM
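    As a rough illustration of the web-service approach described above (the class name, operation, path handling and port are my own, hypothetical choices, and it assumes a JDK that still bundles JAX-WS, i.e. Java 8 or earlier), the service PI calls could be as small as this:
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import javax.jws.WebService;
    import javax.xml.ws.Endpoint;

    // Hypothetical service; PI would call this instead of the (asynchronous-only) File adapter.
    @WebService
    public class FileReadService {

        // Returns the CSV content so PI can map it back into the waiting RFC response.
        public String readCsv(String path) throws java.io.IOException {
            return new String(Files.readAllBytes(Paths.get(path)), StandardCharsets.UTF_8);
        }

        public static void main(String[] args) {
            // Publish on a local endpoint for testing (URL is an example).
            Endpoint.publish("http://localhost:8080/fileRead", new FileReadService());
        }
    }
    PI would then map the RFC request to this operation and map the returned content back to the calling function, as described above.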

  • I downloaded a reader file and it screwed up my Safari browser, and the app opens with Google Quick Search which I can't get rid of

    I downloaded a file to open an owner's manual for a camera I needed. I had difficulty getting it downloaded and never could get the manual. I tried using my Safari browser and started getting this Google Quick Search file which opened in place of my Safari. I tried to close it and I can't. I did manage to get rid of the reader file.

    You may have installed one of the common types of ad-injection malware. Follow the instructions on this Apple Support page to remove it. It's been reported that some variants of the "VSearch" malware block access to the page. If that happens, start in safe mode by holding down the shift key at the startup chime, then try again.
    Back up all data before making any changes.
    One of the steps in the article is to remove malicious Safari extensions. Do the equivalent in the Chrome and Firefox browsers, if you use either of those. If Safari crashes on launch, skip that step and come back to it after you've done everything else.
    If you don't find any of the files or extensions listed, or if removing them doesn't stop the ad injection, ask for further instructions.
    Make sure you don't repeat the mistake that led you to install the malware. It may have come from an Internet cesspit such as "Softonic" or "CNET Download." Never visit either of those sites again. You might also have downloaded it from an ad in a page on some other site. The ad would probably have included a large green button labeled "Download" or "Download Now" in white letters. The button is designed to confuse people who intend to download something else on the same page. If you ever download a file that isn't obviously what you expected, delete it immediately.
    Malware is also found on websites that traffic in pirated content such as video. If you, or anyone else who uses the computer, visit such sites and follow prompts to install software, you can expect more of the same, and worse, to follow. Never install any software that you downloaded from a bittorrent, or that was downloaded by someone else from an unknown source.
    In the Security & Privacy pane of System Preferences, select the General tab. The radio button marked Anywhere  should not be selected. If it is, click the lock icon to unlock the settings, then select one of the other buttons. After that, don't ignore a warning that you are about to run or install an application from an unknown developer.
    Still in System Preferences, open the App Store or Software Update pane and check the box marked
              Install system data files and security updates
    if it's not already checked.

  • Question on CAPolicy.inf file and post-installation script

    I'm preparing a small PKI implementation with a single Enterprise Root CA on Windows 2008 R2 Enterprise.
    The primary role of this CA is to provide certificates for about 20 laptops that will use the certificates for authentication to a wireless network.
    I have prepared a CAPolicy.inf file and a post installation script (below).
    Renewal period for the root cert should be 10 years, CRL publication every 2 days with Delta publication every 12 hours (details in scripts below).
    I want to make sure the AIA and CRL url commands are correct.
    Does this look correct?
    AIA
    1:%WINDIR%\System32\CertSrv\CertEnroll\%%1_%%3%%4.crt
    This should publish the CA certificate to the local file system "certenroll".
    2:ldap:///CN=%%7,CN=AIA,CN=Public Key Services,CN=Services,%%6%%11
    This places the LDAP url in the AIA extension of issued certs.
    I am not planning to use HTTP, hence its absence.
    CRL
    1:%WINDIR%\System32\CertSrv\CertEnroll\%%3%%8%%9.crl
    This publishes the CRL to the local file system ("certenroll" subfolder).
    10:ldap:///CN=%%7%%8,CN=%%2,CN=CDP,CN=Public Key Services,CN=Services,%%6%%10
    Indicates CDP in AD DS and includes CDP url in issued certificates.
    Complete scripts
    1. CAPolicy.inf - %windir%
    [Version]
    Signature= "$Windows NT$"
    [certsrv_server]
    renewalkeylength=2048
    RenewalValidityPeriodUnits=10
    RenewalValidityPeriod=years
    CRLPeriod = days
    CRLPeriodUnits = 2
    CRLDeltaPeriod = hours
    CRLDeltaPeriodUnits = 12
    LoadDefaultTemplates=0
    2. Install Role
    Follow steps in GUI here
    3. Run post-install script
    certutil -setreg CA\DSConfigDN CN=Configuration,DC=mydomain,DC=local
    certutil -setreg CA\CRLPeriodUnits 2
    certutil -setreg CA\CRLPeriod "days"
    certutil -setreg CA\CRLDeltaPeriodUnits 12
    certutil -setreg CA\CRLDeltaPeriod "hours"
    certutil -setreg CA\ValidityPeriodUnits 10
    certutil -setreg CA\ValidityPeriod "Years"
    certutil -setreg CA\CACertPublicationURLs "1:%WINDIR%\System32\CertSrv\CertEnroll\%%1_%%3%%4.crt\n2:ldap:///CN=%%7,CN=AIA,CN=Public Key Services,CN=Services,%%6%%11"
    certutil -setreg CA\CRLPublicationURLs "1:%WINDIR%\System32\CertSrv\CertEnroll\%%3%%8%%9.crl\n10:ldap:///CN=%%7%%8,CN=%%2,CN=CDP,CN=Public Key Services,CN=Services,%%6%%10"
    certutil -setreg CA\csp\DiscreteSignatureAlgorithm 1
    certutil -setreg CA\AuditFilter 127
    net stop certsvc & net start certsvc
    certutil -crl
    Please mark as helpful if you find my contribution useful or as an answer if it does answer your question. That will encourage me - and others - to take time out to help you.

    A couple of things (just a quick glance)
    1) CAPolicy.inf
    Everything looks fine in this file. You will have a 2 day base crl and 12 hour delta CRLs. No default templates are loaded, so you must manually designate all published certificates
    2) Post install Script
    a. Best practice is to use HTTP only, not LDAP, so your whole design does not follow best practices. Furthermore, if you issue a certificate to a single non-domain-joined machine, it will be unable to evaluate your certificates. If you decide to do VPN (or use an internal SSL cert on an externally accessible Web server), revocation checking will fail because you do not publish AD to the external world.
    b. Will you be issuing 10-year certificates from the CA? These lines:
    certutil -setreg CA\ValidityPeriodUnits 10
    certutil -setreg CA\ValidityPeriod "Years"
    define the maximum validity period of certs issued *by* the CA. For example, if a certificate template states 4 years, then a four-year certificate will be issued (less than 10 years). If a template states 12 years, then the maximum age will be 10 years. Think of it as a governor.
    c. CACertPublicationURLs. This should include an HTTP URL, and if you still want to include LDAP, HTTP should come before the LDAP URL.
    d. CRLPublicationURLs. This should include an HTTP URL. If you still want to include LDAP, HTTP should come before the LDAP URL. Your checkbox value for the LDAP URL is also incorrect: for a CA that issues both base and delta CRLs, you must have a value of 79 (all check boxes enabled). Your current configuration would fail revocation checking because you do not publish the base or delta CRL to AD. There is also another check box missing (I believe the one to include the URL in the base CRL's freshest CRL extension, which is how the delta CRL is found).
    e. Do not include the DiscreteSignatureAlgorithm line in the script. If you have any Windows XP clients, they will not be able to use the certs (by the way, this should have been AlternateSignatureAlgorithm in my book, where you sourced your scripts). It would only be used on a CA where you used something like Elliptic Curve as the asymmetric algorithm and SHA384 as the signature algorithm.
    f. The restart line should be net stop certsvc && net start certsvc. The double && ensures that the service comes to a complete stop before the start command is executed.
    HTH,
    Brian

  • HRRCF_MDL_CAND_ATT_CREATE - Read file and convert to 'content'

    Hi,
    We are trying to use FM 'HRRCF_MDL_CAND_ATT_CREATE' to upload an attachment for a Candidate (the Candidate is already in the E-Recruiting system). This FM has an import parameter 'CONTENT' (data type 'RAWSTRING').
    I want to know how to read a file (the file type can be anything) from a drive and convert it into 'CONTENT'. Is there any FM to do this?
    Regards,
    ...Naddy

    I'm getting a type-compatibility error.
    This is my code,
    REPORT  zhr_test_ngo_attachment.
    PARAMETERS: p_cand  TYPE  hrobject DEFAULT '01NA50000305'
              , p_head TYPE rcf_attachment_header
              , p_fname         TYPE  string.
    DATA:   v_record        TYPE  rcf_s_mdl_cand_attachment,
            v_content       TYPE  rcf_attachment_content,
            v_messages      TYPE  bapirettab,
            v_records       TYPE  rcf_t_mdl_cand_attachment,
            v_size          TYPE  i,
            v_string        TYPE  xstring.
    DATA: BEGIN OF xml_tab OCCURS 0,
              raw TYPE x,
           END   OF xml_tab .
    v_record-att_type = '0012'.
    v_record-language = 'E'.
    v_record-att_header = p_head.
    *Get file
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename            = p_fname
        filetype            = 'BIN'
        has_field_separator = ' '
        header_length       = 0
      IMPORTING
        filelength          = v_size
      TABLES
        data_tab            = xml_tab
      EXCEPTIONS
        OTHERS              = 1.
    v_content = xml_tab[].
    CALL FUNCTION 'HRRCF_MDL_CAND_ATT_CREATE'
      EXPORTING
        record        = v_record
        cand_hrobject = p_cand
        content       = v_content
        filename      = p_fname
      IMPORTING
        messages      = v_messages.
    IF sy-subrc <> 0.
      WRITE 'Attachment Error'.
    ENDIF.

Maybe you are looking for

  • Can XI Message be mapped to a flat file attachment in Mail adapter

    Hi guys, I have a requirement, where I have ECC system seding a XI message to XI. Now XI has to convert this to a flat file may be tab delimited and send this as an attachment in email using receiver mail adapter. I want to avoid BPM and make it simp

  • Selective Deletion is possible based on request ID?

    Hello Friends, Our Cube is Not Compressed where Aggregate Rollup was done and Compression of Aggregate rollup are done. In this Case, Can i peform selective delete of data based on Request ID? If yes, 1) Any special procedure and need to take care of

  • Different types of mapping techics

    hi expects,    please provide some mapping technics.

  • Problems with metadata on iPhone

    A large portion of my music in iTunes isn't from iTunes, the majority are CD rips and from sites like Beatport. In iTunes, I keep everything organized, all fields are filled out correctly, and I keep the same fields filled out with every song. This i

  • 4 x 6 Recipe Cards

    I am still old fashion and would like to keep a recipe box.  I have 4 X 6 recipe cards that I would like to be able to type the recipes on the computer and then print them on the cards.  I can't seem to find a template for this purpose.  Any suggesti