Data Filter on CSV file using Data Synchronization
I got an error when I used a Data Filter on a CSV file in a Data Synchronization task. Filter condition: BILLINGSTATE LIKE 'CA'
TE_7002 Transformation stopped due to a fatal error in the mapping. The expression [(BILLINGSTATE LIKE 'CA')] contains the following errors [<<PM Parse Error>> missing operator ... (BILLINGSTATE>>>> <<<<LIKE 'CA')].
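Independent of the Informatica parser issue, the intended filter — keep only rows whose BILLINGSTATE is 'CA' (LIKE with no wildcards behaves like an exact match) — can be sketched outside the tool in plain Python; the column name is taken from the post:

```python
import csv

def filter_rows(src_path, dst_path, column="BILLINGSTATE", value="CA"):
    """Copy only the rows whose `column` equals `value`.
    SQL LIKE 'CA' with no wildcards is equivalent to an exact match."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row[column] == value:
                writer.writerow(row)
```

This is only a sketch of the filter's intent, not a substitute for the Data Synchronization task itself.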
Hi,
Yes, this can be done through BEx Broadcaster.
Please follow the steps below:
1. Open your query in BEx Analyzer.
2. Go to BEx Analysis Toolbar -> Tools -> BEx Broadcaster...
3. Click "Create New Settings", select "Broadcast Email" as the "Distribution Type" and "CSV" as the "Output Format".
4. Enter the recipients' email addresses under the "Recipients" tab.
5. Enter the subject and body of the mail under the "Texts" tab.
6. Save the setting and execute it.
Now the query data will be attached as a CSV file and sent to the recipients by email.
Hope this helps.
Regards,
Murali
Similar Messages
-
Export batch data into CSV file using SQL SP
Hi,
I have created a WCF-Custom receive adapter to poll a SQL stored procedure (WITH XMLNAMESPACES(DEFAULT 'Namespace') and FOR XML PATH(''), TYPE). I get the result properly in batches while polling, but I get an error while converting it to CSV using a map.
Can anyone please give me some idea of how to export SQL data into a CSV file using an SP? How are you doing this?
You would have got an XML representation of the XML batch received from SQL.
You should have a flat-file schema representing the CSV file which you want to send.
Map the received XML representation of the data from SQL to the flat-file schema.
Have a custom pipeline with a flat-file assembler in the assemble stage of the send pipeline.
In the send port, use the map which converts the received XML from SQL to the flat-file schema, and use the above custom flat-file send pipeline.
If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow mark next to my reply. -
How to get Document Set property values in a SharePoint library in to a CSV file using Powershell
Hi,
How to get Document Set property values in a SharePoint library into a CSV file using Powershell?
Any help would be greatly appreciated.
Thank you.
AA.

Hi,
According to your description, my understanding is that you want to get the document set property values in a SharePoint library and then export them to a CSV file using PowerShell.
You can get the document set properties with a PowerShell script like the one below:
[system.reflection.assembly]::LoadWithPartialName("Microsoft.SharePoint")
$siteurl = "http://sp2013sps/sites/test"
$listname = "Documents"
$mysite = New-Object Microsoft.SharePoint.SPSite($siteurl)
$myweb = $mysite.OpenWeb()
$list = $myweb.Lists[$listname]
foreach ($item in $list.Items)
{
    if ($item.ContentType.Name -eq "Document Set")
    {
        if ($item.Folder.ItemCount -eq 0)
        {
            Write-Host $item.Title
        }
    }
}
Then you can use the Export-Csv cmdlet to export the results to a CSV file.
More information:
Powershell for document sets
How to export data to CSV in PowerShell?
Using the Export-Csv Cmdlet
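Independent of the SharePoint object model, the export step boils down to handing one property dictionary per document set to a CSV writer — which is what Export-Csv does with objects. A generic Python sketch of that step, with hypothetical property names:

```python
import csv

def export_properties(items, path):
    """Write a list of property dictionaries to a CSV file, one column
    per property; the keys of the first item define the header."""
    if not items:
        return
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(items[0]))
        writer.writeheader()
        writer.writerows(items)
```

The `Title`/`Owner` names in any caller are illustrative; real document set columns depend on the library.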
Thanks
Best Regards
TechNet Community Support
Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
[email protected] -
Issue while loading a csv file using sql*loader...
Hi,
I am loading a csv file using sql*loader.
On the number columns where data is populated (decimal numbers/integers), the rows error out with:
ORA-01722: invalid number
I checked the values picked from the Excel file and found chr(13), chr(32) and chr(10) characters in them.
For example, select length('0.21') from dual on the actual field value gives 7,
and checking each character, e.g.
select ascii(substr('0.21',5,1)) from dual, returns a value of 9, etc.
I tried the following expression,
"to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
to remove all the non-numeric special characters, but I am still getting the error.
Please let me know of any solution for this error.
Thanks in advance.
Kiran

Control file:
OPTIONS (ROWS=1, ERRORS=10000)
LOAD DATA
CHARACTERSET WE8ISO8859P1
INFILE '$Xx_TOP/bin/ITEMS.csv'
APPEND INTO TABLE XXINF.ITEMS_STAGE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
ItemNum "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
cross_ref_old_item_num "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
Mas_description "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
Mas_long_description "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
Org_description "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
Org_long_description "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
user_item_type "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
organization_code "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
primary_uom_code "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
inv_default_item_status "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
inventory_item_flag "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
stock_enabled_flag "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
mtl_transactions_enabled_flag "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
revision_qty_control_code "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
reservable_type "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
check_shortages_flag "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
shelf_life_code "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
shelf_life_days "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
lot_control_code "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
auto_lot_alpha_prefix "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_lot_number "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
negative_measurement_error "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
positive_measurement_error "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
serial_number_control_code "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
auto_serial_alpha_prefix "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
start_auto_serial_number "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
location_control_code "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
restrict_subinventories_code "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
restrict_locators_code "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
bom_enabled_flag "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
costing_enabled_flag "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
inventory_asset_flag "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
default_include_in_rollup_flag "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
cost_of_goods_sold_account "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
std_lot_size "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
sales_account "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
purchasing_item_flag "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
purchasing_enabled_flag "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
must_use_approved_vendor_flag "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
allow_item_desc_update_flag "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
rfq_required_flag "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
buyer_name "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
list_price_per_unit "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
taxable_flag "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
purchasing_tax_code "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
receipt_required_flag "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
inspection_required_flag "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
price_tolerance_percent "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
expense_account "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
allow_substitute_receipts_flag "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
allow_unordered_receipts_flag "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
receiving_routing_code "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
inventory_planning_code "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
min_minmax_quantity "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
max_minmax_quantity "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
planning_make_buy_code "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
source_type "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
mrp_safety_stock_code "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
material_cost "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
mrp_planning_code "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
customer_order_enabled_flag "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
customer_order_flag "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
shippable_item_flag "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
internal_order_flag "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
internal_order_enabled_flag "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
invoice_enabled_flag "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
invoiceable_item_flag "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
cross_ref_ean_code "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
category_set_intrastat "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
CustomCode "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
net_weight "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
production_speed "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
LABEL "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
comment1_org_level "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
comment2_org_level "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
std_cost_price_scala "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
supply_type "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
subinventory_code "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
preprocessing_lead_time "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
processing_lead_time "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
wip_supply_locator "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
Sample data from csv file.
"9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
The load errors out on two columns in particular:
1) std_cost_price_scala
2) list_price_per_unit
both are number columns.
When data is provided in them, the load errors out; if they hold null values, the records go through fine.
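Those invisible characters can be audited before the load ever runs. A minimal Python sketch — a hypothetical pre-load diagnostic, not part of SQL*Loader — that flags fields containing anything outside printable ASCII, which is exactly what makes Oracle's TO_NUMBER raise ORA-01722:

```python
import csv

def find_hidden_chars(path, encoding="latin-1"):
    """Report (line, column, char codes) for every CSV field that
    contains a character outside printable ASCII (tab, CR, LF, ...)."""
    bad = []
    with open(path, newline="", encoding=encoding) as f:
        for line_no, row in enumerate(csv.reader(f), start=1):
            for col_no, field in enumerate(row, start=1):
                suspects = [ord(c) for c in field if ord(c) < 32 or ord(c) > 126]
                if suspects:
                    bad.append((line_no, col_no, suspects))
    return bad
```

Running this over the staging file before SQL*Loader would show which rows carry chr(9)/chr(13)/chr(10) inside number fields.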
-
Adding row into existing CSV file using C#
How can I add a row to an existing CSV file using .NET code? The file should not be overwritten; it needs another row of data appended. How do I implement this scenario?
Hi BizQ,
If you just want to write some data to a CSV file, please follow A.Zaied's and Magnus's replies. In general, we use CSV files to import or export data, as in the following thread and a good CodeProject article:
Convert a CSV file to Excel using C#
Writing a DataTable to a CSV file
Best regards,
Kristin
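Whatever the .NET specifics in the links above, the core requirement is opening the file in append mode rather than overwrite mode. A Python sketch of the idea (the file path is up to you):

```python
import csv

def append_row(path, row):
    """Append one row to an existing CSV file without overwriting it:
    mode "a" positions all writes at the end of the file."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(row)
```

In C#, the same effect comes from constructing a StreamWriter with its append parameter set to true.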
-
How to get DocSet property values in a SharePoint library into a CSV file using Powershell
Hi,
How to get DocSet property values in a SharePoint library into a CSV file using Powershell?
Any help would be greatly appreciated.
Thank you.
AA.

Hi AOK,
Would you please post your current script and the issue for more efficient support?
In addition, to manage document sets in SharePoint, please refer to this script to start:
### Load SharePoint snap-in
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null)
{
    Add-PSSnapin Microsoft.SharePoint.PowerShell
}
### Load SharePoint object model
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

### Get web and list
$web = Get-SPWeb http://myweb
$list = $web.Lists["List with Document Sets"]

### Get Document Set content type from list
$cType = $list.ContentTypes["Document Set Content Type Name"]

### Create Document Set properties hashtable
[Hashtable]$docsetProperties = @{"DocumentSetDescription"="A Document Set"}
$docsetProperties += @{"CustomColumn1"="Value 1"}
$docsetProperties += @{"CustomColum2"="Value2"}
### Add all your columns for your Document Set

### Create new Document Set
$newDocumentSet = [Microsoft.Office.DocumentManagement.DocumentSets.DocumentSet]::Create($list.RootFolder, "Document Set Title", $cType.Id, $docsetProperties)
$web.Dispose()
http://www.letssharepoint.com/2011/06/document-sets-und-powershell.html
If there is anything else regarding this issue, please feel free to post back.
Best Regards,
Anna Wang
-
Reading csv file using file adapter
Hi,
I am working on SOA 11g. I am reading a CSV file using a file adapter. Below are the file contents and the XSD generated by JDeveloper.
.csv file:
empid,empname,empsal
100,Ram,20000
101,Shyam,25000
XSD generated by JDeveloper:
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" xmlns:tns="http://TargetNamespace.com/EmpRead" targetNamespace="http://TargetNamespace.com/EmpRead" elementFormDefault="qualified" attributeFormDefault="unqualified"
nxsd:version="NXSD"
nxsd:stream="chars"
nxsd:encoding="ASCII"
nxsd:hasHeader="true"
nxsd:headerLines="1"
nxsd:headerLinesTerminatedBy="${eol}">
<xsd:element name="Root-Element">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="Child-Element" minOccurs="1" maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="empid" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
<xsd:element name="empname" minOccurs="1" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
<xsd:element name="empsal" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;" />
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:schema>
For empname I have added minOccurs=1. Now when I remove the empname column, the CSV file still gets read from the server without giving any error.
Then I created the following XML file and read it through the file adapter:
<?xml version="1.0" encoding="UTF-8" ?>
<Root-Element xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://TargetNamespace.com/EmpRead xsd/EmpXML.xsd" xmlns="http://TargetNamespace.com/EmpRead">
<Child-Element>
<empid>100</empid>
<empname></empname>
<empsal>20000</empsal>
</Child-Element>
<Child-Element>
<empid>101</empid>
<empname>Shyam</empname>
<empsal>25000</empsal>
</Child-Element>
</Root-Element>
When I removed the value of empname, it throws the proper error for the above XML.
Please tell me why the behaviour of the file adapter is different for the CSV file and the XML file in the above case.
Thanks -
How to upload a .CSV file using GUI_UPLOAD
Hi Experts,
In my report, I need to upload a .CSV file using GUI_UPLOAD. How do I do this? Please provide a solution.

Hi prashanthishetty,
this is already answered many times in this forum!
use forum search or wiki search
[http://wiki.sdn.sap.com/wiki/display/Snippets/uploadcsvfilesintointernal+table]
regards
rea -
Is it possible to monitor State change of a .CSV file using powershell scripting ?
Hi All,
I would just like to know: is it possible to monitor a state change of a .CSV file using PowerShell scripting? We have SCOM, which has that capability, but there are some drawbacks that prevent us from utilising it, so I would like to know whether this is possible using PowerShell.
If there is any number above 303 in the .CSV file, then I need an email alert/notification for it.
Gautam.75801

Hi Jrv,
Thank you very much. I modified the above and it worked.
Import-Csv C:\SCOM_Tasks\GCC2010Capacitymanagement\CapacityMgntData.csv | ?{$_.Mailboxes -gt 303} | Export-Csv -Path C:\SCOM_Tasks\Mbx_Above303.csv
Send-MailMessage -Attachments "C:\SCOM_Tasks\Mbx_Above303.csv" -To "[email protected]" -From "abc@xyz" -SmtpServer [email protected] -Subject "Mailboxes are above 303 in Exchange databases" -Body "Mailboxes are above 303 in Exchange databases"
Mailboxes is the column I want to monitor for values above 303. The first command extracts all rows above 303 into another CSV file, and the second emails it to me with that extract attached.
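The logic of that pipeline — read the CSV, keep rows whose numeric column exceeds the threshold, write the matches to a second file — looks like this as a Python sketch (column name and threshold taken from the post):

```python
import csv

def rows_above(src_path, dst_path, column="Mailboxes", threshold=303):
    """Copy the rows whose numeric `column` value exceeds `threshold`
    into a second CSV file; return how many rows matched."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        matches = [row for row in reader if int(row[column]) > threshold]
        writer.writerows(matches)
    return len(matches)
```

The return value could drive the decision of whether to send the alert mail at all, so no email goes out when nothing exceeds the threshold.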
Gautam.75801 -
How to merge two columns in csv file using vbscript?
How can I merge two columns in a CSV file using VBScript or PowerShell?
1 , 2 , 3 , 4 , 5
1 , 23 , 4 , 5
Thanks

Here is an example:
http://msdn.microsoft.com/en-us/library/ms191250.aspx
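The linked page covers SQL Server computed columns rather than CSV files. Merging two columns of the CSV itself is only a few lines; a Python sketch, with 0-based column indexes assumed from the sample rows:

```python
import csv

def merge_columns(src_path, dst_path, first=1, second=2, sep=""):
    """Concatenate columns `first` and `second` (0-based) into one
    column, shrinking each row by one field."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for row in csv.reader(src):
            merged = row[first].strip() + sep + row[second].strip()
            writer.writerow(row[:first] + [merged] + row[second + 1:])
```

The same loop translates almost line for line into VBScript with `Split`/`Join` or into PowerShell with `Import-Csv`.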
Gary Newman MCSE, MCT, CCNA, MCSD, MCPD, MCDBA, MCAD, MCSA, MCTS Developer and Administrator On SharePoint 2013 SharePoint Practice Manager for AmeriTeach Denver, CO. -
Re: How to Import .CSV file using Mac Desktop Manager 1.0.4?
I am at a total loss. I have exported my existing address book of 3100 contacts and it can be in any format - csv or whatever.
It is from an app called NowContcat and is up to date; you may not know it.
Anyway, all I want to do is import it to my BB Bold 9000 running the latest upgrades.
Can anyone help? I have tried to read all of the posts, but there does not seem to be anything in the manual or otherwise.
SDK

Hey kempyau,
To clarify your issue, are you looking to take the csv file of 3100 contacts and synchronize it with your BlackBerry?
You can import the contacts in Mac Address Book and then use BlackBerry Desktop Software to synchronize the contacts.
-ViciousFerret
Come follow your BlackBerry Technical Team on Twitter! @BlackBerryHelp
Be sure to click Like! for those who have helped you.
Click Accept as Solution for posts that have solved your issue(s)! -
Issue in conversion of output file from alv to csv file using GUI_DOWNLOAD
hi,
I am using GUI_DOWNLOAD to convert the internal table that I get as the output of an ALV into a CSV (comma-separated) file. I am using the following code, but instead of generating a CSV file it generates a normal space-delimited file.
The code is as follows:
data : lv_fname type string.
lv_fname = 'C:\Users\pratyusha_tripathi\Desktop\status8.csv'. " Provide the file path & file name with CSV extention
CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
    filename              = lv_fname " File name including path, give CSV as extension of the file
    filetype              = 'DAT'
    write_field_separator = '#'      " Provide comma as separator
  TABLES
    data_tab              = itab     " Pass the output internal table
*   fieldnames            =
  EXCEPTIONS
    OTHERS                = 22.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
Kindly let me know what changes can be made to make my code work. Also, can GUI_DOWNLOAD be used for batch processing, storing the output on the application server?
Thanks ,
Pratyusha

Hi,
the short text description for WRITE_FIELD_SEPARATOR is "Separate Columns by Tabs in Case of ASCII Download", so why do you expect a comma?
Try SAP_CONVERT_TO_CSV_FORMAT and then download.
And no, GUI_DOWNLOAD is only for download via the SAP GUI to a user's computer.
Best regards,
Oliver -
Split records into Multiple csv files using a Threshold percentage
Hi Gurus,
I have a requirement to split data from a table into two CSV files using a threshold value (in percent).
Assume my interface's source select query fetches 2000 records and I provide a threshold value of 20%.
I need to generate csv1 with 400 records (20% of 2000) and put the rest of the records into csv2.
For implementing this I am trying to use the following process.
1) Create a procedure with the select query to get the count of records.
Total Records count: select count(1) from source_table <Joins> <Lookups> <Conditions>;
2) Calculate the Record count to first CSV using the threshold_value.
CSV1_Count = Total records count * threshold_value / 100
3) Create a view that fetches the CSV1_Count(400) records for CSV1 as follows.
Create view CSV1_view as select Col1,Col2,Col3 from source_table <Joins> <Lookups> <Conditions>
Where rownum<=CSV1_Count;
4) Generate CSV1 file using View 'CSV1_View'
5) Generate CSV2 File using the Interface with same select statement (with columns ) to generate a CSV.
select Col1,Col2,Col3 from source_table ST <Joins> <Lookups> <Conditions>
Left outer join (Select Col1 from CSV1_View ) CS on CS.Col1=ST.Col1 where CS.Col1 is null;
Which gives the Total records minus the CS1_View records.
The above process seems a bit complex rather than simple, and if anything changes in my interface I also need to change the procedure (which counts the number of records).
Please provide your comments and feedback about this and looking for your inputs for any new simple approach or fine tune the above approach.
Thanks,
Arjun

Arjun,
These are my thoughts; let's do it in 3 steps.
Step 1. ODI procedure:
Drop table Temp_20;
Create table Temp_20 as select * from table where rownum <= (SELECT TRUNC(COUNT(1)/5) FROM TABLE);
[** This way I am fetching approx 20% of the table data and loading it into a temp table; 1/5th is 20%, so I divide the count by 5.
I don't believe a view will help you, especially with ROWNUM: if you run the same query with rownum < N twice, the row order might differ, so a temp table is better. **]
Step 2. Use OdiSqlUnload with select columns from Temp_20.
Step 3. Use OdiSqlUnload again with select columns from table where (uk keys) not in (select uk_keys from Temp_20).
[** This way you pick the remaining 80%, and the data will not repeat itself across the 20% and 80% sets, as might happen with a view. **]
What do you think?
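The temp-table plan above can be prototyped outside ODI as well. A Python sketch of the same 20/80 split over a CSV extract (paths and percentage are illustrative):

```python
import csv

def split_csv(src_path, dst1, dst2, threshold_pct=20):
    """Send the first threshold_pct% of the data rows to dst1 and the
    remainder to dst2; the header row is written to both files."""
    with open(src_path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    cut = len(data) * threshold_pct // 100
    for path, chunk in ((dst1, data[:cut]), (dst2, data[cut:])):
        with open(path, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(chunk)
```

Like the Temp_20 approach, this splits by position, so the two outputs can never overlap or miss a row between them.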
Download a UNIX based CGI page into a CSV file using Powershell
Hello,
I have been struggling to get a powershell script to work.
I have a web-based page which ends in .cgi and I want to copy the text from that page to a CSV file.
Normally the script below works for .html or .aspx pages, but this time I have been getting 'The remote server returned an error: (401) Unauthorized'. I think the error occurs because the page is Unix-based, and for some reason I am not able to automate the authentication process.
Below is the Powershell script I have been using. I have already created the securestring for my password and saved in a temp folder.
$wc=new-object system.net.webclient
$password = get-content 'C:\temp\Scripts\SecureString.txt' | convertto-securestring
$wc.Credentials = new-object -typename System.Management.Automation.PSCredential -argumentlist "Domain\username", $password
$wc.downloadfile("https://website.domain.com/cgi-bin/text_status_csv.cgi", "C:\temp\Scripts\output.csv")
Currently I am able to successfully copy the data from the page over if I use
$wc.Credentials = Get-Credential
And then type in the username and password manually. I could really use some help to get this automated.
Thanks in advance for your help.

A little searching and you will find hundreds of examples:
https://www.google.com/#newwindow=1&q=powershell+persist+credentials
¯\_(ツ)_/¯ -
Getting empty csv file using servlet
Hi
I am working on reports for my web application, using the Struts framework.
For my reports I want CSV export, so I have written a servlet. When I click the generate button I get a popup window to save the generated CSV file on my local system, but I am getting an empty CSV file.
Nothing is there in that file; forget about the data, not even my header fields.
Here is my servlet; please let me know where I am going wrong.
public class ReportServlet extends HttpServlet{
public void doPost(HttpServletRequest req,HttpServletResponse res)
throws ServletException,IOException
res.setContentType("text/csv");
res.setHeader("Content-Disposition", "attachment; filename=\"export.csv\"");
PrintWriter out = res.getWriter();
AdvDetailReportBean reportBean = null;
ArrayList list =(ArrayList)req.getSession().getAttribute("advreportlist");
System.out.println(" servlet report list size is"+list.size());
String branchcode=(String)req.getSession().getAttribute("branchcode");
String bName=(String)req.getSession().getAttribute("branchname");
System.out.println(" servlet branch name"+bName);
System.out.println(" servlet branch code"+branchcode);
StringBuffer fw = new StringBuffer();
fw.append("Branch Code");
fw.append(',');
fw.append("Branch Name");
fw.append('\n');
fw.append(branchcode);
fw.append(',');
fw.append(bName);
fw.append('\n');
fw.append('\n');
fw.append("Customer Name");
fw.append(',');
fw.append("Constitution Code");
fw.append(',');
fw.append("Customer Status");
fw.append(',');
fw.append("Restructure Date");
fw.append(',');
fw.append("Total Provision");
fw.append(',');
fw.append("Limit Sanctioned");
fw.append(',');
fw.append("Principal");
fw.append(',');
fw.append("Balance");
fw.append(',');
fw.append("AccountID");
fw.append(',');
fw.append("Collateral SL No");
fw.append(',');
fw.append("Issue Date Of Collateral");
fw.append(',');
fw.append("MaturityDate Of Collateral");
fw.append(',');
fw.append("Subsidy");
fw.append(',');
fw.append("Guarantor SL No");
fw.append(',');
fw.append("Guarantor Rating Agency ");
fw.append(',');
fw.append("External Rating of Guarantor");
fw.append(',');
fw.append("Rating Expiry Date");
fw.append(',');
fw.append("Guarantee Amount");
fw.append(',');
fw.append('\n');
for (Iterator it = list.iterator(); it.hasNext(); )
reportBean = new AdvDetailReportBean();
reportBean = (AdvDetailReportBean)it.next();
fw.append(reportBean.getCustomername());
fw.append(',');
fw.append(reportBean.getConstitutionCode());
fw.append(',');
fw.append(reportBean.getCustomerStatus());
fw.append(',');
fw.append(reportBean.getRestructureDate());
fw.append(',');
fw.append(reportBean.getTotalProvision());
fw.append(',');
fw.append(reportBean.getLimitSanctioned());
fw.append(',');
fw.append(reportBean.getPrincipal());
fw.append(',');
fw.append(reportBean.getBalance());
fw.append(',');
fw.append(reportBean.getCurrentValue());
fw.append(',');
fw.append(reportBean.getAccountNumber());
fw.append(',');
fw.append(reportBean.getColCRMSecId());
fw.append(',');
fw.append(reportBean.getIssueDt());
fw.append(',');
fw.append(reportBean.getMarturityDt());
fw.append(',');
fw.append(reportBean.getUnAdjSubSidy());
fw.append(',');
fw.append(reportBean.getGuarantorFacilityId());
fw.append(',');
fw.append(reportBean.getRatingAgency());
fw.append(',');
fw.append(reportBean.getExternalRating());
fw.append(',');
fw.append(reportBean.getExpDtOfRating());
fw.append(',');
fw.append(reportBean.getGuaranteeAmt());
fw.append(',');
fw.append('\n');
}

You don't seem to be writing anything to the response at all. Yes, you create a StringBuffer and write lots of stuff to it, but then you do nothing else with that buffer. Write it to the response at the end, e.g. out.write(fw.toString());, and the file will no longer be empty.