Ports required for DNS integrated zone replication

Hi All,
A segment of our network is secured behind a firewall. Inside this segment I have a Windows 2012 R2 DNS server that also hosts Active Directory-integrated zones. Which ports should I allow so that the DNS server can replicate the DNS zones to and from the main network?
I read this: https://technet.microsoft.com/en-us/library/dd772723%28WS.10%29.aspx?f=255&MSPPError=-2147217396
but I would like to limit the open ports to the minimum.

Hello,
you wrote "inside this segment I have a Windows 2012 R2 DNS Server that also hosts AD integrated zones".
So this server is a domain controller. AD-integrated zones are stored in Active Directory and replicate through normal AD replication, not through DNS zone transfers, so you need to allow the full set of domain controller replication ports between this DC and the rest of the domain, not just the DNS port.
Best regards
Meinolf Weber
MVP, MCP, MCTS
Microsoft MVP - Directory Services
My Blog: http://blogs.msmvps.com/MWeber
Disclaimer: This posting is provided AS IS with no warranties or guarantees and confers no rights.
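If it helps, here is a minimal sketch (Python) for checking reachability of the fixed TCP ports a 2012 R2 domain controller typically needs for AD and DNS replication. The host name is a placeholder, and the UDP counterparts (53, 88, 123, 389, 464) plus the dynamic RPC range (49152-65535) also have to be allowed even though they are not probed here:

# ad_port_probe.py - quick TCP reachability check for the fixed ports a DC typically needs
# (DNS, Kerberos, RPC endpoint mapper, LDAP, SMB, kpasswd, LDAPS, Global Catalog).
# The dynamic RPC range and the UDP counterparts are not probed by this sketch.
import socket

DC_HOST = "dc1.segment.example.local"   # placeholder for the DC inside the firewalled segment
FIXED_TCP_PORTS = [53, 88, 135, 389, 445, 464, 636, 3268, 3269]

def tcp_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

if __name__ == "__main__":
    for port in FIXED_TCP_PORTS:
        state = "reachable" if tcp_open(DC_HOST, port) else "blocked or not listening"
        print(f"{DC_HOST}:{port}  {state}")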

Similar Messages

  • Ports required for voice gateway registration

    Hi,
    Currently our remote office voice gateway is trying to register to the CM, and there is a firewall in between. We have opened DNS, NTP, 2427 and 2428, but it is still stuck in the registering state with the call manager. What other ports should we open to make it work?
    What about the port requirement for CUE?
    Thanks.

    For MGCP:
    DNS
    NTP
    UDP 2427
    TCP 2428
    TFTP (UDP 69)
    For CUE, here is a link you may find helpful:
    http://www.cisco.com/en/US/partner/netsol/ns340/ns394/ns165/ns391/networking_solutions_design_guidance09186a00801f8e31.html#wp41149
    hth,
    nick

  • What are the ports required for the Audio, Video and A/V conferencing when the following end points are enabled for QoS in Lync 2013 server?

    Hi All,
    What are the ports required for the Audio, Video and A/V conferencing when the following clients are enabled for QoS in Lync 2013 server?
    For each of the following client types, what port range and protocol are required for audio, for video, and for A/V conferencing?
    Windows desktop client
    Windows mobile app
    iPhone
    iPad
    Android phone
    Android tablet
    Mac desktop client
    Please advise. Many Thanks.

    Out of the box, 1024-65535 for all of the client ports.  :) 
    https://technet.microsoft.com/en-us/library/gg398833.aspx
    You'll want to tune your client ports a bit, as shown here: https://technet.microsoft.com/en-us/library/jj204760.aspx. The clients will then use those ranges, which makes it easier to set QoS markings. I'm not sure the mobile clients respect that setting.
    Elan's got the best writeup for Windows clients here:
    http://www.shudnow.net/2013/02/16/enabling-qos-for-lync-server-2013-and-various-clients-part-1/
    However, the marking of the packets is the tricky part.  Windows can do it via Group Policy, but for the other clients you'll need to have the network specifically prioritize ports regardless of DSCP markings.  You have to do it based on ports
    as the traffic could be peer to peer.
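    For illustration only, here is a small Python sketch of what a DSCP marking on a socket looks like on platforms that honor IP_TOS directly (the address and port are placeholders, and this is not part of any Lync API); on Windows, as noted above, the marking normally has to come from a Group Policy QoS policy rather than from the application:
    # dscp_mark_sketch.py - mark a UDP socket with DSCP EF (46), the class commonly used for audio
    import socket

    DSCP_EF = 46                # Expedited Forwarding
    TOS_BYTE = DSCP_EF << 2     # DSCP sits in the upper 6 bits of the legacy TOS byte (0xB8)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_BYTE)   # typically ignored on stock Windows
    sock.sendto(b"media payload", ("192.0.2.10", 50020))          # placeholder address and media port
    sock.close()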
    Please remember, if you see a post that helped you please click "Vote As Helpful" and if it answered your question please click "Mark As Answer".
    SWC Unified Communications
    This forum post is based upon my personal experience and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • Mapping Error: Mappings are not required for this Integrator

    Hi,
    when I created my upload integrator with the Desktop Integrator Manager (12.1.2) I am not able to attach a mapping to it. I get the following error:
    Mapping Error: Mappings are not required for this Integrator because the ability to download information has not been enabled.
    Does anyone know what I've missed?
    cheers
    Jeroen

    Hi ,
    This error appears if you try to define the mapping before creating the content.
    Solution:
    Create the content first, and then do the mapping.
    Regards,
    Sreekanth.S
    Edited by: user12045904 on Dec 9, 2010 10:55 PM

  • Ports Required for SCCM Distribution point

    Hi All,
    Can anybody tell me the ports required for a distribution point? I have a site server with a distribution point at HO, and I want to set up a distribution point server at a remote site. I went through the Microsoft document, which says you need port 445 open between the site server and the distribution point, and only port 80 or 443 open between a distribution point and a branch distribution point. I'm a little confused: which ports need to be open between the site server (with its distribution point) and the remote site distribution point?
    Cheers.

    Hi,
    Regarding the doc here:
    http://technet.microsoft.com/en-us/library/bb632618.aspx
    Site Server --> Distribution Point:
    SMB (TCP 445)
    RPC Endpoint Mapper (TCP and UDP 135)
    RPC (dynamic TCP ports)
    Site Server <--> Site Server:
    SMB (TCP 445)
    Point to Point Tunneling Protocol (PPTP, TCP 1723)
    Follow me through my blog and Twitter!

  • Port required for Veritas cluster implementation

    hello there ,
    I need to know what ports are required for a Veritas cluster implementation on Sun Messaging Server 6.2. Anybody care to help me on this?
    thanks

    > We are planning a 2 node Oracle 9i RAC cluster on Sun Cluster 3.
    Good. This is a popular configuration.
    > Can you please explain these 2 questions?
    > 1) If we have a hardware disk array RAID controller with LUNs etc, then why do we need to have Veritas Volume Manager (VxVM) if all the LUNs are configured at a hardware level?
    VxVM is not required to run RAC. VxVM has an option (separately licensable) which is specifically designed for OPS/RAC. But if you have a highly reliable, multi-pathed, hardware RAID platform, you are not required to have VxVM.
    > 2) Do we need to have VxFS? All our Oracle database files will be on raw partitions.
    No.
    IMHO, simplify is a good philosophy. Adding more software
    and layers into a highly available design will tend to reduce
    the availability. So, if you are going for maximum availability,
    you will want to avoid over-complicating the design. KISS.
    In the case of RAC, or Oracle in general, many people do use
    raw and Oracle has the ability to manage data in raw devices
    pretty well. Oracle 10g further improves along these lines.
    A tenet in the design of highly available systems is to keep
    the data management as close to the application as possible.
    Oracle, and especially 10g, are following this tenet. The only
    danger here is that they could try to get too clever, and end up
    following policies which are suboptimal as the underlying
    technologies change. But even in this case, the policy is
    coming from the application rather than the supporting platform.
    -- richard

  • Ports required for GG setup (Oracle to Oracle replication )

    GG version: 11.2.1.0.1
    OS : RHEL 5.4
    We are going to configure GoldenGate to replicate DML for a few tables (uni-directional) from source to target.
    Since there is a firewall between source and target, we need to request the network team to open ports on both the source and target servers.
    For the Manager process, we are going to use the default 7809 at both source and target. What other ports do we need to ask the network team to open on both servers?

    Hi,
    If a firewall is being used at an Oracle GoldenGate target location, additional ports are required on the target system to receive dynamic TCP/IP communications from remote
    Oracle GoldenGate processes. These ports are:
    ● One port for each Collector process that is started by the local Manager to receive propagated transaction data from remote online Extract processes. When an Extract
    process sends data to a target, the Manager on the target starts a dedicated Collector process.
    ● One port for each Replicat process that is started by the local Manager as part of a remote task. A remote task is used for initial loads and is specified with the RMTTASK
    parameter. This port is used to receive incoming requests from the remote Extract process.
    ● Some extra ports in case they are needed for expansion of the local Oracle GoldenGate configuration.
    ● Ports for the other Oracle GoldenGate products if they interact with the local Oracle GoldenGate instance, as stated in the documentation of those products.
    To specify these ports, use the DYNAMICPORTLIST parameter in the Manager parameter file.
    Follow these guidelines:
    ● You can specify up to 5000 ports in any combination of the following formats:
    7830, 7833, 7835
    7830-7835
    7830-7835, 7839
    ● The ports must be unreserved and unrestricted.
    ● Each Manager instance on a system must use a different port list.
    Although not a required parameter, DYNAMICPORTLIST is strongly recommended for best performance. The Collector process is responsible for finding and binding to an available
    port, and having a known list of qualified ports speeds this process. In the absence of DYNAMICPORTLIST (or if not enough ports are specified with it), Collector tries to use port 7840 for remote requests. If 7840 is not available, Collector increments by one until it finds an available port. This can delay the acceptance of the remote request. If Collector runs out of ports in the DYNAMICPORTLIST list, the following occurs:
    ● Manager reports an error in its process report and in the Oracle GoldenGate ggserr log.
    ● Collector retries based on the rules in the Oracle GoldenGate tcperrs file
    For more information about PORT and DYNAMICPORTLIST, see the Oracle GoldenGate Windows and UNIX Reference Guide.
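    To make the accepted list formats above concrete, here is a small Python sketch (illustration only, not part of GoldenGate) that expands a DYNAMICPORTLIST-style value into the individual port numbers Manager can hand out:
    # expand_portlist.py - expand a DYNAMICPORTLIST-style value into individual port numbers
    def expand_portlist(value):
        """Expand a string such as '7830-7835, 7839' into a sorted list of ports."""
        ports = set()
        for item in value.split(","):
            item = item.strip()
            if not item:
                continue
            if "-" in item:
                low, high = (int(part) for part in item.split("-"))
                ports.update(range(low, high + 1))
            else:
                ports.add(int(item))
        return sorted(ports)

    print(expand_portlist("7830, 7833, 7835"))   # [7830, 7833, 7835]
    print(expand_portlist("7830-7835"))          # [7830, 7831, 7832, 7833, 7834, 7835]
    print(expand_portlist("7830-7835, 7839"))    # adds 7839 to the range above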
    Hopefully this will help you
    Annamalai.

  • Missing SOA record for AD integrated zone

    We are in the process of updating our Domain Controllers. We have 3 (now 2) Domain Controllers running server 2008 and 3 new Domain Controllers running Server 2012 R2. In DNS, we have 3 AD integrated zones. 1 of the zones is missing the SOA record on all
    3 new DCs.
    Before the record went missing, I first noticed an issue when attempting to demote one of the Server 2008 DCs.  I had received the following warning:
    "This Active Directory domain controller appears to be the last DNS server for the following Active Directory-integrated zones:
    zonename If you demote this domain controller, you may be unable to resolve any DNS names in these zones."
    I found a TechNet article with someone in the same boat, and the solution was basically to ignore the warning.
    Well I checked out the zone in question, and noticed that the SOA record on the 3 new DCs had an old version/serial number (25 on the new DCs, 126 on the old DCs).  This is a zone that rarely gets touched.  I did an increment serial number on one
    of the old DCs and they were then all showing version 127 for their SOA record, so the replication was working.  The other odd thing about the SOA record is that on the OLD DCs, the record pointed to themselves, like they should.  But on the new
    DCs, the SOA record pointed to other DCs, which they shouldn't be doing.  Well, whatever, I went ahead and completed demoting the 2008 DC I was working on.
    After demotion completed, I attempted to fix the messed up SOA record on the new DCs.  First I tried changing the replication scope from all DNS servers on DCs in the domain to all DCs (Win 2000 compatible).  This didn't do anything for the SOA
    record.  For my next attempt, I took one of the old DCs and removed the problem zone from AD, making it a standard primary zone.  Then I removed the AD-integrated version of the zone from another DC and waited for the zone removal to replicate. 
    So now the only copy of the zone is a standard primary on 1 DC.  Then I switched the zone to AD integrated and waited for replication.  Sure enough the zone appeared on all DCs.  However, now on the 3 new DCs, the zone in question is now missing
    the SOA record entirely.  On the old DCs, the SOA record looks fine.  When I open the zone properties on one of the new DCs and select the SOA tab, it just says "The data is not available."
    So there we go.  3 DCs missing their SOA record for an AD integrated zone.  Any suggestions?

    Well I found the source of the problem.  The zone giving me trouble has a CNAME record that's the same name as the zone itself.  For example, for the zone testZone.local there is a CNAME record called testZone.local that points to www.testZone.local.
     That way if someone types testZone.local into their browser, they end up at www.testZone.local.  After some research, I discovered that CNAME records cannot share the same name as any other record.  In my example, the CNAME record has the same
    name as the SOA and NS records in the zone.  Although many DNS servers allow this practice, it is not a valid DNS configuration.
    While this setup worked fine in our Server 2008 environment, it definitely causes problems in Server 2012 R2.  Somewhere between those 2 versions, Microsoft changed their DNS implementation.  So to avoid any issues with our zone, we just need to
    configure it correctly!

  • Ports Required for SMTP access from DMZ

    We have a Windows 2000 Adv Server on a DMZ interface of a PIX firewall. We are using native Windows SMTP services as a Front End server for Exchange mail. Our Exchange server has a SmartHost entry that sends outbound mail to the server on the DMZ. Our MX record points to the server on the DMZ for inbound traffic.
    We originally allowed DNS resolution and SMTP (Port 25) traffic to the server. We've done this numerous times from the Internal interface of the PIX. Yet, there apparently is at least one other port that needs to be opened up because the mail stays in the Queue of the SMTP server on the DMZ. We got around the problem by opening up all outbound ports from that server.
    My question is: "Does anyone know what ports are required for an SMTP server to work on a PIX DMZ?"
    Thanks

    Should just be TCP/25 and probably DNS (UDP/53). Probably the easiest way to figure out what other port it's using is to look at the active connections from this going through your PIX.
    Let's say the IP address of the mail server is 10.1.1.1. Doing:
    sho conn | include 10.1.1.1
    will give you all the connections. This will tell you where it's connecting to and on what ports. The output will look something like:
    FW1(config)# sho conn | incl 10.1.1.1
    UDP out 10.2.2.1:17127 in 10.1.1.1:10655 idle 0:01:23 Bytes 1000
    UDP out 10.2.2.1:18733 in 10.1.1.1:10477 idle 0:01:38 Bytes 1000
    UDP out 10.3.3.2:18429 in 10.1.1.1:10789 idle 0:01:10 Bytes 1000
    The numbers after the colons are the port numbers on the connection. Of course yours will show TCP and port 25 (and something else hopefully), but you get the idea.
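    If you would rather test from the server on the DMZ itself than read connections off the PIX, here is a minimal Python sketch (the smart-host / MX name is a placeholder) that checks the two things assumed above, name resolution and an outbound TCP/25 session:
    # smtp_probe.py - confirm DNS resolution and an outbound SMTP session from the DMZ host
    import smtplib
    import socket

    NEXT_HOP = "mail.example.com"   # placeholder for the destination MX / next hop

    try:
        addr = socket.gethostbyname(NEXT_HOP)              # needs DNS (UDP/53) to the resolver
        print(f"resolved {NEXT_HOP} -> {addr}")
        with smtplib.SMTP(addr, 25, timeout=10) as smtp:   # needs outbound TCP/25
            code, _banner = smtp.noop()
            print(f"SMTP NOOP returned {code}")
    except OSError as exc:                                 # socket and SMTP errors both land here
        print(f"probe failed: {exc}")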

  • Requirements for 9iAS Integration

    After days of plugging through the documents, reading manuals and searching Google's archives, I am still not sure exactly what is needed to get this going. We would like to set up the Integration piece (InterConnect, Workflow, iStudio) within our enterprise. We are trying to get this working on a Windows 2000 server. I have formatted the server several times for one reason or another. I am hoping to avoid this in the future.
    I have amassed a bundle of questions:
    1. What parts of 9ias are required for this to work
    2. Does the Metadata Repository have to be on the same box as the Hub?
    3. Can I use an existing 9i database as the Metadata Repository instead of creating a new database?
    4. How do I define the path to the Workflow homepage?
    Thanks for any help,
    Chris

    1.
    For building integration apps you would need InterConnect and a database, preferably the iasdb (i.e. the Infrastructure DB), for the message hub and hub repository (schema).
    Note that Metadata Repository is another name for the database included in Oracle9i AS (a 9.0.1 DB) loaded with a bunch of "repositories" (schemas) for various AS components like Portal, Discoverer, OEM and Workflow.
    2.
    If you mean the InterConnect install type "Hub"- no.
    (but in the hub and spoke model, Metadata Repository is used as hub)
    In my experience, Oracle has a lot of work left to do on InterConnect. Documentation is useless - and now I'm being nice. Support looks weak and has failed miserably when dealing with basic stuff.
    (Perhaps this has something to do with OAI being hidden in Applications in previous incarnations?).

  • Firewall Ports Required for NAC manager to manage/add Cisco switch

    Hi,
    I am trying to add Cisco switches to the NAM; however, I am not able to add the switch, as I am getting the error "unable to control switch". I have tried opening ports 161-162 on the firewall; if I allow any traffic between the NAM and the switch, the Cisco NAM is able to add/manage the switch.
    Not sure what other ports may be required for the Cisco NAM to manage the switch?
    Thanks.

    Hi,
    AFAIK, only the UDP ports 161-162 for the SNMP communication need to be open.
    Please make sure you have configured the correct port on the switch:
    (config)# snmp-server host 172.16.1.61 traps version 2c cam_v2 udp-port 162 mac-notification snmp
    If still not working i would check the logs on the firewall for any blocked traffic between the CAM and the switch.
    HTH,
    Tiago
    If  this helps you and/or  answers your question please mark the question  as "answered" and/or rate  it, so other users can easily find it.

  • Resource Requirement for a global zone

    Hi All,
    We are going to deploy Solaris containers in our environment. However, we would like to know whether there are any guidelines on the resource requirements for setting up the global zone. In other words, what minimum CPU / memory should be assigned to the global zone?
    Thanks in advance.
    Regards

    You don't need to worry about defending the global zone. The design of the pool_default pool includes safeguards that prevent starvation by other pools. The most obvious is the zone strategy itself; there's no way to allocate all the physical memory to a zone before the kernel itself has pinned down what it needs.
    I think it would make better sense to allocate some (large) percentage of memory to the Oracle container and adjust as needed, rather than try and lock every bit of it. Maybe 14 GB and see how that works. I can think of any number of annoyances you might encounter by trying to lock down more memory than is actually available.

  • Ports required for Goldengate Setup 7809......

    hi,
    I am using the default port 7809 on my server. We also have a firewall, and we have opened only that one port, i.e. 7809 (Telnet srev2 7809 ... connected),
    but there is a problem with the data pump process: it can't establish the network connection with the remote server.
    So are there any other ports that should be opened on the source and target servers as well?
    Regards,
    AMSII

    1. Paste your source and target mgr.prm and data pump parameter file entries.
    2. What is the exact error message in the report file?
    3. Did you open ports for the source system as well as the target system?
    For more details:
    Assigning Manager a port for local communication
    The Manager process in each Oracle GoldenGate installation requires a dedicated port for communication between itself and other local Oracle GoldenGate processes. To specify this
    port, use the PORT parameter in the Manager parameter file. Follow these guidelines:
    1. The default port number for Manager is 7809. You must specify either the default port number (recommended, if available) or a different one of your choice.
    2. The port must be unreserved and unrestricted.
    3. Each Manager instance on a system must use a different port number.
    Use the DYNAMICPORTLIST parameter to specify a list of available ports to which the following,
    local Oracle GoldenGate processes can bind for communication with a remote Oracle GoldenGate process:
    ● Collector: to communicate with a remote Extract to receive incoming data.
    ● Replicat: to communicate with a remote Extract to receive data during an initial load task.
    ● Passive Extract: to communicate with a remote Collector
    ● GGSCI: to issue remote commands
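    Since the data pump first contacts the target Manager (7809) and is then handed off to a Collector listening on one of the dynamic ports, a quick reachability sketch from the source server can show whether only 7809 is currently allowed. This is Python, the target host name is a placeholder, and 7840-7850 is only an assumed range; use whatever DYNAMICPORTLIST is configured in the target mgr.prm, and remember a dynamic port only accepts connections while a Collector is actually listening on it:
    # gg_port_probe.py - run from the source server against the target's Manager port
    # and an assumed DYNAMICPORTLIST range (7840-7850 here; match the target mgr.prm).
    import socket

    TARGET_HOST = "target.example.com"             # placeholder
    PORTS = [7809] + list(range(7840, 7851))

    for port in PORTS:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(3)
            reachable = sock.connect_ex((TARGET_HOST, port)) == 0
            print(f"{TARGET_HOST}:{port}  {'reachable' if reachable else 'blocked or not listening'}")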
    Hopefully this will help you
    Annamalai.

  • Ports required for communication between Web servers and service applications (the default is HTTP)

    We're using SharePoint 2010, and I'm the system admin for a SharePoint farm. We enabled SharePoint Search by adding a Search Service Application. One of the crawl report timer jobs is failing every 5 minutes with the error "Cannot connect to remote server".
    After digging around, we found that the server running the timer job tries to connect to SearchAdmin.svc on the index server over HTTPS / port 32844. However, communication over SSL via a non-default port is blocked by our firewall.
    According to this article: https://technet.microsoft.com/en-us/library/cc262849.aspx the default is HTTP for communication between web servers. How is it possible that it's trying to connect over SSL?

    Hi,
    Quoted from
    https://technet.microsoft.com/en-us/library/cc262849.aspx#ServiceApp :
    You can change the protocol and port binding for each service application. On the Service Applications page in Central Administration, select the service application, and then click
    Publish.
    Here is an article for configuring Windows firewall port rules for SharePoint using PowerShell in case you need:
    http://www.xylos.com/default.aspx?id=1050
    Regards,
    Rebecca Tu
    TechNet Community Support
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
    [email protected]

  • Minimum How many dimension required for FDM integration Script

    Hi Gurus
    I have only 2 dimensions in my SQL table dbo.ABC (1. Entity, 2. Account), plus the amount (data value).
    Example:
    USA, SALES, 50000
    (Including the value, that is 3 columns in total.)
    How do I export this data to the target HFM application?
    The integration script succeeds, and when I click Validate it shows only the 2 dimensions (1. Account, 2. Entity), which I have mapped correctly, but the validation screen does not show anything. I get a gold fish on the Validate step, and Export also shows success with a gold fish, but no data is exported to the HFM application.
    In the FDM outbox a file is created that contains only the *!data* text; there are no records in this file.
    I want to load the data with the rest of the dimensions set to the [None] member combination, as I don't have the additional dimensions in my source file.
    What is the minimum number of dimensions required to export data from FDM to HFM?
    regards
    Taruni

    Hi,
    I came to know that at least one member from the source file should be present in the integration script; only then can we assign the [None] member (or any other member) to the target dimensions.
    My source file has only 3 fields (USA, Sales, Amount):
    1. USA, 2. Sales, 3. $50000
    Import Screen Dimensions:
    1.Source-FM-Entity
    2.Source-FDM-Account
    3.Account Description
    4.SourceICP
    5.SourceCustom1
    6.SourceCustom2
    7.SourceCustom3
    8.SourceCustom4
    9.Amount
    In the integration script it takes the values as:
    Source-FM-Entity(0)
    Source-FDM-Account(1)
    Account Description
    SourceICP
    SourceCustom1
    SourceCustom2
    SourceCustom3
    SourceCustom4
    Amount(2)
    Above, only the numbers 0, 1, 2 are assigned to source dimensions.
    As my source file has only 3 fields, it picks up only the 3 dimensions shown below; the rest of the dimensions are not shown on the import screen.
    *0. Source-FM-Entity, 1. Source-FDM-Account, 2. Amount*
    If I assign any of the values (3-9) to the remaining dimensions, or if I leave rs.fields("txtAcctDes") blank, it shows the error messages below:
    Error: An error occurred importing the file.
    Detail: Item cannot be found in the collection corresponding to the requested name or ordinal.
    At line: (39 and 42-46)
    So I have assigned the Source-FDM-Account number (rs.fields(1)) value to the rest of the dimensions in my integration script:
    rsAppend.Fields("Account") = rs.fields(1).Value
    rsAppend.Fields("Desc1") = rs.fields(1).Value
    rsAppend.Fields("ICP") = rs.fields(1).Value
    rsAppend.Fields("UD1") = rs.fields(1).Value
    rsAppend.Fields("UD2") = rs.fields(1).Value
    rsAppend.Fields("UD3") = rs.fields(1).Value
    rsAppend.Fields("UD4") = rs.fields(1).Value
    Now I am able to import the data into the import screen, and all of the above members show up as Sales, since I temporarily assigned the Account dimension number (1) to them to get the import to succeed. I then mapped them to the target dimensions with the [None] member combination, as these members are not in the original source file. The rest of the process (Export and Check) completes perfectly.
    *1. Am I right? Please suggest the correct process.*
    *2. Can we use blank values in the integration script, as mentioned below?*
    rsAppend.Fields("Desc1") = rs.fields("txtAcctDes").Value
    rsAppend.Fields("Account") = rs.fields("txtAcct").Value
    rsAppend.Fields("Entity") = rs.fields("txtCenter").Value
    *1.Added value*
    Example: rsAppend.Fields("Desc1") = rs.fields("1").Value
    *2.Blank Value*
    rsAppend.Fields("Desc1") = rs.fields("txtAcctDes").Value
    *<font color="red">3.As per my observation system is not accepting blank values in integration script. Please correct me??</font>*
    Here is my Integration Script
    1     Function Integration(strLoc, lngCatKey, dblPerKey, strWorkTableName)
    2     '------------------------------------------------------------------
    3     'Oracle Hyperion FDM IMPORT Integration Script:
    4     Created By: admin
    5     Date Created: 2012-11-20-07:55:20
    6     'Purpose:
    7     '------------------------------------------------------------------
    8     Dim cnSS 'ADODB.Connection
    9     Dim strSQL 'SQL String
    10     Dim rs 'Recordset
    11     Dim rsAppend 'tTB table append rs Object
    12     'Initialize objects
    13     Set cnSS = CreateObject("ADODB.Connection")
    14     Set rs = CreateObject("ADODB.Recordset")
    15     Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
    16     'Connect To SQL Server database
    17     cnss.open "Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=TEST;Data Source=localhost;"
    18     strSQL = "Select * "
    19     strSQL = strSQL & "FROM ABC"
    20     'Get data
    21     rs.Open strSQL, cnSS
    22     'Check For data
    23     If rs.bof And rs.eof Then
    24     RES.PlngActionType = 2
    25     RES.PstrActionValue = "No Records To load!"
    26     Integration = False ' Assign return value of function
    27     Exit Function
    28     End If
    29     'Loop through records And append To tTB table In location’s DB
    30     If Not rs.bof And Not rs.eof Then
    31     Do While Not rs.eof
    32     rsAppend.AddNew
    33     rsAppend.Fields("PartitionKey") = RES.PlngLocKey
    34     rsAppend.Fields("catKey") = lngCatKey
    35     rsAppend.Fields("PeriodKey") =dblPerKey
    36     rsAppend.Fields("DataView") = "YTD"
    37     rsAppend.Fields("CalcAcctType") = 9
    38     rsAppend.Fields("Amount") = rs.fields(2).Value
    39     rsAppend.Fields("Desc1") = rs.fields(1).Value
    40     rsAppend.Fields("Account") = rs.fields(1).Value
    41     rsAppend.Fields("Entity") = rs.fields(0).Value
    42     rsAppend.Fields("ICP") = rs.fields(1).Value
    43     rsAppend.Fields("UD1") = rs.fields(1).Value
    44     rsAppend.Fields("UD2") = rs.fields(1).Value
    45     rsAppend.Fields("UD3") = rs.fields(1).Value
    46     rsAppend.Fields("UD4") = rs.fields(1).Value
    47     rsAppend.Update
    48     rs.movenext
    49     Loop
    50     End If
    51     'Records loaded
    52     RES.PlngActionType = 2
    53     RES.PstrActionValue = "SQL Import successful!"
    54     'Assign Return value
    55     Integration = True
    56     End Function
    Regards
    Taruni
